Our patients have spoken: keep radiologists in the centre of AI imaging ecosystems

The Original Article was published on 08 November 2019

In European Radiology, Ongena and colleagues [1] have developed a standardised questionnaire to evaluate the patient perspective on the implementation of artificial intelligence in radiology. In doing so, the authors have endeavoured to address an important blind spot in AI research, namely the need to assess the impact of new technologies in their social, cultural and political milieu [2].

Using exploratory factor analysis on patient feedback, the authors identified five variables reflecting patient concerns in radiology AI—(1) trust and accountability, (2) understanding of the acquisition procedure and interpretation, (3) human communication, (4) efficiency and (5) being informed of AI utilisation for their radiological diagnosis. To paraphrase the factor scores for each variable, patients want accurate and fast results they can understand and believe, with opportunities to clarify their queries and doubts, and to receive emotional support, when their results are delivered. These are fair expectations and, whilst not completely achievable at present, should remain aspirational for AI developers and radiologists alike.

Ongena et al. [1] found that patients are moderately negative when it comes to their trust in AI taking over the diagnostic interpretation tasks of the radiologist with regard to accuracy, communication and confidentiality. This bodes well for AI systems designed as human-computer augmentative “centaur” systems, in which it is imperative to effectively leverage the respective strengths of both machine intelligence and human ingenuity [3]. Medicine is, and continues to be, a “high touch” profession, and it has been shown that good patient–doctor communication and empathy are more important to patient satisfaction than medical outcome [4]. Radiology AI solutions often fail to reflect the true complexities of clinical practice, which are compounded by multiple data sources and their temporal variances [5, 6]. Quite apart from the ambiguity of scientific evidence, there are persistent public suspicions of AI arising from the intrinsic mystery of the AI black box [7, 8]. Whilst professionals themselves struggle to explain results from an AI algorithm, expectations of public trust should be realistic and patient.

The paper found a patient preference for AI to examine all parts of the body, and for AI to perform more opportunistic imaging and predictive analysis. Perhaps patients have drawn parallels with day-to-day life, where annual certification inspection for vehicle roadworthiness is commonplace, so we may see the development of routine whole-body AI screening scans. Whole-body MRI (WB-MRI) is already recommended in international guidelines for the assessment of several cancer-prone syndromes [9], and it is not much of a stretch of the imagination that it may come to pass in healthcare as emphasis shifts towards preserving the health span, rather than treating disease. Furthermore, state-of-the-art liquid biopsies, which promise to detect cancer at an early stage through the isolation of circulating tumour cells and tumour DNA [10], provide a rich source of genomic and proteomic data which can be combined to augment multimodal AI screening for cancer. Whilst this is not an endorsement of this strategy as a management plan for public health—mainly due to the enormous cost inefficiencies—should the cost fall and the availability of this technology rise past a threshold of feasibility, it remains an attractive proposition.

Lastly, the study reveals that the level of educational attainment plays a role in how patients perceive AI technology. This flies in the face of conventional wisdom that age, or the lack thereof, is the main determinant of a generation’s receptiveness to new technologies. Despite the fact that younger generations were born as digital natives, it cannot be assumed that an implicit trust has been established between them and AI machines.

Similarly, cultural factors are likely to play a large role in the acceptance of AI technology amongst patients, and this is an avenue for future research. It has been shown that Japanese people are more comfortable with robots than their American counterparts, in part due to the Japanese Shinto practice which embraces animism [11].

All said, the artificially intelligent imaging ecosystem is in its infancy. Given the magnitude of its presence in the field of radiology in this fourth industrial age, its societal impact needs to be widely reassessed once the technology is more mature. Nonetheless, asking the right questions of our patients is the proper place to start.

"If I had an hour to solve a problem and my life depended on the solution, I would spend the first 55 minutes determining the proper question to ask, for once I know the proper question, I could solve the problem in less than five minutes." - Attributed to Albert Einstein (1879–1955), Nobel Laureate Theoretical Physicist

References

  1. Ongena YP, Haan M, Yakar D, Kwee TC (2019) Patients’ views on the implementation of artificial intelligence in radiology: development and validation of a standardized questionnaire. Eur Radiol. https://doi.org/10.1007/s00330-019-06486-0

  2. Crawford K, Calo R (2016) There is a blind spot in AI research. Nature 538:311–313. https://doi.org/10.1038/538311a


  3. Liew CJ, Krishnaswamy P, Cheng LT, Tan CH, Poh AC, Lim TC (2019) Artificial intelligence and radiology in Singapore: championing a new age of augmented imaging for unsurpassed patient care. Ann Acad Med Singapore 48:16–24

  4. Heffring MP, Neilsen EJ, Szklarz MJ, Dobson GS (1986) High tech, high touch: common denominators in patient satisfaction. Hosp Health Serv Adm 31:81–93


  5. Cook TS (2019) Human versus machine in medicine: can scientific literature answer the question? Lancet Digital Health 1(6):e246–e247


  6. Savadjiev P, Chong J, Dohan A et al (2019) Demystification of AI-driven medical image interpretation: past, present and future. Eur Radiol 29(3):1616–1624


  7. Choy G, Khalilzadeh O, Michalski M et al (2018) Current applications and future impact of machine learning in radiology. Radiology 288(2):318–328


  8. Verghese A, Shah NH, Harrington RA (2018) What this computer needs is a physician: humanism and artificial intelligence. JAMA 319(1):19–20


  9. Petralia G, Padhani AR, Pricolo P et al (2019) Whole-body magnetic resonance imaging (WB-MRI) in oncology: recommendations and key uses. Radiol Med 124:218–233. https://doi.org/10.1007/s11547-018-0955-7


  10. Palmirotta R, Lovero D, Cafforio P et al (2018) Liquid biopsy of cancer: a multimodal diagnostic tool in clinical oncology. Ther Adv Med Oncol 10:175883591879463. https://doi.org/10.1177/1758835918794630


  11. Mims C (2010) Why Japanese love robots (and Americans fear them). MIT Technology Review. Available via http://www.technologyreview.com/view/421187/why-japanese-love-robots-and-americans-fear-them/. Accessed 29 Sep 2019

Funding

The authors state that this work has not received any funding.

Author information

Corresponding author

Correspondence to Charlene Liew.

Ethics declarations

Guarantor

The scientific guarantor of this publication is Dr Charlene Liew.

Conflict of interest

The authors of this manuscript declare no relationships with any companies, whose products or services may be related to the subject matter of the article.

Statistics and biometry

No complex statistical methods were necessary for this paper.

Informed consent

Not applicable

Ethical approval

Institutional Review Board approval was not required because this is an editorial comment.

Methodology

• Not applicable

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This comment refers to the article available at https://doi.org/10.1007/s00330-019-06486-0.

About this article

Cite this article

Liew, C., Lim, C.Y. Our patients have spoken: keep radiologists in the centre of AI imaging ecosystems. Eur Radiol 30, 1031–1032 (2020). https://doi.org/10.1007/s00330-019-06531-y
