Researchers explore patients' perceptions of AI in radiology

AI-powered MRIs could cut the potentially toxic dyes used in traditional MRIs by up to 80 percent, but will patients embrace AI-driven brain scans?

Dr. Farkhondeh (Ferry) Hassandoust is a senior lecturer in information systems at the University of Auckland Business School. Image Credit: University of Auckland

Getting a brain scan is not only a nerve-racking, claustrophobic experience; it also involves the use of chemical agents such as gadolinium-based contrast dyes.

While these dyes enhance the clarity of MRI images, they can cause toxicity in the body.

Recent advances in artificial intelligence (AI) are poised to reduce the reliance on gadolinium-based agents, and AI-driven MRI scans might soon offer a safer alternative for patients.

In light of these cutting-edge options, researchers from Australia, New Zealand and France are exploring patient perceptions of AI in radiology, particularly brain MRIs.

Saeed Akhlaghpour and Javad Pool from the University of Queensland, Farkhondeh Hassandoust (University of Auckland) and Roxana Ologeanu-Taddei (TBS Education) surveyed 619 participants to uncover the factors influencing people's acceptance of AI in MRI scans.

Dr Hassandoust says that as AI begins to match and, in some instances, surpass human capabilities in tasks such as image analysis, the subtleties of AI implementation need to be explored.

“We became interested in patients' perceptions of AI in radiology after learning about a Sydney-based startup, DeepMeds, which uses AI to generate MRI images with significantly less contrast dye.

“Learning about what they were doing inspired our study," she says.

"We wanted to know if patients would accept and trust AI imaging tools over more tried and tested methods. We were keen to find out how their understanding of the technology, including how it works, as well as any risks, benefits and other features, might influence their openness to AI-driven MRIs."

Their findings highlight the importance of transparency and communication: specifically, the concept of AI explainability.

‘Explainable AI’ is artificial intelligence that's programmed to describe its purpose, rationale and decision-making process in a way that the average person can understand. In the context of MRI scans, this might mean showing how an AI system analyses an image and arrives at a particular diagnosis.

"This feature not only helps radiologists understand the AI's reasoning process, but also empowers patients by making the diagnostic process transparent and less daunting."

Dr. Farkhondeh (Ferry) Hassandoust, University of Auckland
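For readers curious what "showing how an AI system analyses an image" can look like in practice, the sketch below illustrates one common explainability technique, occlusion sensitivity, on a toy 2D image. Everything here is an illustrative assumption for this article: the synthetic "scan", the hypothetical score_fn standing in for a trained classifier, and the function names are not the study's or any vendor's actual system.

```python
# Minimal sketch of occlusion sensitivity, one common "explainable AI" technique.
# A grey patch is slid across the image; regions whose masking causes the largest
# drop in the model's score are the regions the model relied on most.
import numpy as np

def occlusion_saliency(image, score_fn, patch=8):
    """Return a heatmap of how much score_fn's output drops when each patch-sized
    region of the image is occluded."""
    h, w = image.shape
    baseline = score_fn(image)
    saliency = np.zeros((h, w))
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = image.mean()  # mask one region
            saliency[y:y + patch, x:x + patch] = baseline - score_fn(occluded)
    return saliency

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scan = rng.random((64, 64))       # stand-in for a 2D MRI slice (not real data)
    scan[24:40, 24:40] += 1.0         # a bright "lesion-like" region

    def score_fn(img):
        # Hypothetical classifier score: the mean intensity of the central region,
        # standing in for a trained model's output probability.
        return float(img[24:40, 24:40].mean())

    heatmap = occlusion_saliency(scan, score_fn)
    print("Most influential region (row, col):",
          np.unravel_index(heatmap.argmax(), heatmap.shape))
```

In a clinical setting, a heatmap like this would typically be overlaid on the scan so that a patient or radiologist can see which regions drove the system's recommendation.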

The study shows that explainability plays a pivotal role in building trust, regardless of a patient's health condition. Whether facing a cancer diagnosis or seeking answers for a minor issue like sinus congestion, participants preferred AI systems that could demystify their recommendations.

One respondent noted the inconsistency of human diagnoses: “I happen to know, already, that readings of various scans (MRI, CT, x-ray) are already rather unreliable. Show the same film to 100 radiologists, get at least 10 different answers... What's just a shadow to one is definitely something to worry about to another…”.

In contrast, explainable AI was seen as safer, more consistent and accurate.

One participant said: “I would choose it over the traditional MRI because of the dye they put in your body. I've had it injected into my body and it's not a good experience. It makes me get very flushed and hot, then I just feel bad afterwards. Then they tell you to drink a lot of liquids to get the stuff out of your body.”

Participants also highlighted the potential benefits of AI-driven MRIs, as well as barriers such as insurance coverage. One participant said: “I feel with the new AI technology I would be getting the best treatment that was fully detailed and thorough throughout my MRI.”

Another said: “I think it could save me money if problems are detected earlier. I'm always concerned about a health problem getting too expensive. It would also potentially spare me from some side effects of traditional MRI.”

The use of AI in healthcare, particularly radiology, extends beyond MRIs. In 2023, nearly 80 percent of AI-enabled medical applications approved by the FDA were in radiology, a field Hassandoust says is well-suited to AI's strengths in pattern recognition and image enhancement.

“Unlike what we call ‘black box’ systems, like ChatGPT, which don’t explain how they work, explainable AI can help patients, clinicians and radiologists better understand and gain confidence in these emerging technologies.

“These tools can enhance diagnostic precision, address workforce shortages and reduce healthcare costs.”

Source:

University of Auckland
