Mayo Clinic study explores patient privacy in MRI research

Though identifying data typically are removed from medical image files before they are shared for research, a Mayo Clinic study finds that this may not be enough to protect patient privacy.

"At Mayo Clinic, we take patient privacy as a core value. We are studying potential gaps in deidentification as we seek ways to improve these techniques."

Christopher Schwarz, Ph.D., Mayo Clinic researcher and computer scientist in the Center for Advanced Imaging Research, and the study's lead author

The study, described in a letter published in the New England Journal of Medicine, finds that it is possible to use commercial facial recognition software to identify people from brain MRIs that include imagery of the face, despite the steps researchers typically take to protect patient privacy. This is a potential issue for study participants who share brain imaging data. It is not related to patient care, and it is not limited to studies at Mayo Clinic.

"This is only applicable if people can get access to the MRI scans in publicly available research databases. It is not related to medical care, where data is secured," Dr. Schwarz says.

The researchers add that this risk applies only to people whose imaging data has been released into the public domain through their participation in research studies. Typically, before researchers can gain access to such data, they are required to sign a data-use agreement stating that they will not try to identify the participants.

Today's standard when sharing MRI scans for research is to remove identifiers such as name and identification number. But imagery of the face included in the MRI remains accessible. Software programs to remove or blur faces in MRIs have been available for many years, but they haven't been widely used because they can degrade researchers' ability to automatically measure brain structures from the images, according to Dr. Schwarz. Even when used, the software may not fully prevent reidentification of the patient.

To determine whether facial recognition software could identify people from an MRI, researchers recruited 84 volunteers who had undergone a brain MRI within the past three months and took new photographs of their faces. Researchers then created facial reconstruction images from each MRI and attempted to match these reconstructions to the photographs using publicly available facial recognition software.

For 70 of 84 participants, the correct MRI image was chosen as the software's No. 1 match for those participants' photographs, an 83% success rate. In 80 of 84 cases, the correct MRI image was among the top five possible matches for participants' photos, a 95% success rate.
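
To illustrate how such results are typically tallied, the sketch below computes top-1 and top-5 identification rates from a similarity matrix between photographs and MRI-derived reconstructions. It is a minimal, hypothetical example in Python; the scores, seed, and function names are illustrative assumptions, not the study's actual software or data.

```python
# Minimal sketch (not the study's pipeline): given a similarity score between
# each participant's photograph and every MRI-derived face reconstruction,
# compute top-1 and top-5 identification rates. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 84  # number of participants in the study

# Hypothetical similarity matrix: rows = photographs, columns = reconstructions.
# A real pipeline would obtain these scores from facial recognition software.
similarity = rng.random((n, n))
similarity[np.arange(n), np.arange(n)] += 0.5  # make correct pairs score higher

def top_k_rate(sim: np.ndarray, k: int) -> float:
    """Fraction of photographs whose correct reconstruction is among the top-k matches."""
    ranked = np.argsort(-sim, axis=1)[:, :k]        # best-to-worst matches per photo
    correct = np.arange(sim.shape[0])[:, None]      # index of the true reconstruction
    return float(np.mean((ranked == correct).any(axis=1)))

print(f"Top-1 rate: {top_k_rate(similarity, 1):.0%}")  # study reported 70/84, about 83%
print(f"Top-5 rate: {top_k_rate(similarity, 5):.0%}")  # study reported 80/84, about 95%
```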

"Our study's 83% match rate suggests that facial recognition presents a possible means to reidentify research participants from their cranial MRIs," according to the researchers. This could mean a breach of associated health information, including diagnoses, genetic data and results of other imaging.

Institutions typically share imaging data only with researchers who legally commit not to attempt to identify participants. Still, "We understand there are concerns about the negative impact of facial recognition technology on personal privacy," says Clifford Jack, M.D., a Mayo Clinic radiologist and the study's senior author, who is also a member of the National Academy of Medicine. Dr. Jack is the Alexander Family Professor of Alzheimer's Disease Research.

"Dr. Schwarz's work points out that these concerns include the possibility of identifying individual research participants who have been guaranteed anonymity as a condition of their participation in medical research. This is an issue that the medical research community must be aware of and address."

The Mayo team plans to publish another manuscript detailing their novel potential solution and how it improves on existing privacy protection efforts.

"We are making good progress toward an initial solution," says Dr. Schwarz. "Making data private and keeping it private is an always-evolving field. The insights we gained in this study will help us in our work to keep patient data private and use it more effectively for research into diseases and potential new therapies."

"With advances in digital technologies, in this case facial recognition software, it's critical that we continue to revisit the promises that we've made to our patients, particularly promises related to the confidentiality of their medical data," says Richard Sharp, Ph.D., Lloyd A. and Barbara A. Amundson Professor of Biomedical Ethics and director of the Biomedical Ethics Research Program, who was involved in this research. "Much of our work in biomedical ethics focuses on protecting patients from unanticipated harms and this is an excellent illustration of the importance of that work."
