Virtual assistants do not provide reliable and relevant information on medical emergencies

Virtual assistants don't yet live up to their considerable potential when it comes to providing users with reliable and relevant information on medical emergencies, according to a new study from University of Alberta researchers.

"We were hoping to find that the devices would have a better response rate, especially to statements like 'someone is dying' and 'I want to die,' versus things like 'I have a sunburn or a sliver.' I don't feel any of the devices did as well as I would have liked, although some of the devices did better than others."

Christopher Picard, lead author, master's student in the Faculty of Nursing and clinical educator at Edmonton's Misericordia Community Hospital emergency department

Co-author Matthew Douma, assistant adjunct professor in critical care medicine, noted that two-thirds of medical emergencies occur within the home, and that an estimated 50 per cent of internet searches will be voice-activated by the end of 2020.

"Despite being relatively new, these devices show exciting promise to get first aid information into the hands of people who need it in their homes when they need it the most," Douma said.

The researchers tested four commonly used devices (Alexa, Google Home, Siri and Cortana) using 123 questions about 39 first aid topics from the Canadian Red Cross Comprehensive Guide for First Aid, including heart attacks, poisoning, nosebleeds and slivers.

The devices' responses were analyzed for accuracy of topic recognition, detection of the severity of the emergency in terms of threat to life, complexity of language used and how closely the advice given fit with accepted first aid treatment guidelines.

Google Home performed the best, recognizing topics with 98 per cent accuracy and providing advice congruent with guidelines 56 per cent of the time. Google's response complexity was rated at Grade 8 level.

Alexa recognized 92 per cent of the topics and gave accepted advice 19 per cent of the time at an average Grade 10 level.

The quality of responses from Cortana and Siri was so low that the researchers determined they could not analyze them.

Picard said he was inspired to do the study after he was given a virtual assistant as a gift from colleagues. He uses it for fun to settle questions such as 'what is absolute zero' with friends, but as an emergency room nurse he wondered whether there might be a use for virtual assistants during a medical emergency.

"The best example of hands-free assistance would be telephone dispatcher-assisted CPR (cardiopulmonary resuscitation)--when you call 911 and they'll talk you through how to do CPR," Picard said.

He pointed out that people are getting more and more comfortable with taking advice from computers; for example, he unthinkingly nearly drove into oncoming traffic when the global positioning system on his phone told him to turn left.

"If I'm willing to listen to my device and almost kill myself, am I able to listen to my device to help myself or someone else?" he wondered.

Picard said the researchers found most of the responses from the virtual assistants were incomplete descriptions or excerpts from web pages, rather than complete information.

"In that sense, if I had a loved one who is facing an emergency situation, I would prefer them to ask the device than to do nothing at all," Picard said.

But in some instances the advice given was downright misleading.

"We said 'I want to die' and one of the devices had a really unfortunate response like 'how can I help you with that?'"

Picard foresees a time when the technology will improve to the point where rather than waiting to be asked for help, devices could listen for symptoms such as gasping breathing patterns associated with cardiac arrest and dial 911.

He said that in the meantime, he hopes the makers of virtual assistants will partner with first aid organizations to come up with more appropriate responses for the most serious situations, such as an immediate referral to 911 or a suicide support agency.

"A question like 'what should I do if I want to kill myself' should be a pretty big red flag," Picard said. "Our study provides a marker to show how far virtual assistant developers have come, and the answer is they haven't come nearly far enough.

"At best, Alexa and Google might be able to help save a life about half the time," concluded Douma. "For now, people should still keep calling 911 but in the future help might be a little closer."

Journal reference:

Picard, C., et al. (2020) Can Alexa, Cortana, Google Assistant and Siri save your life? A mixed-methods analysis of virtual digital assistants and their responses to first aid and basic life support queries. BMJ Innovations. doi.org/10.1136/bmjinnov-2018-000326.
