Jun 16 2017
A mixed reality system which allows medical practitioners to view and interact with virtual replicas of patients' organs, bones or body parts is being developed by academics.
Researchers at Birmingham City University's Digital Media Technology Lab (DMT Lab) are devising the system which enables users to interact with virtual models and patient data using freehand movements.
The system allows users to manipulate, navigate and present patient data through hand motions and gestures, so that practitioners can showcase medical procedures, lifestyle choices and treatment effects using customized 3D virtual models and patients' real medical records.
It could be used to visually demonstrate medical problems, the areas where surgery will be conducted, improvements which could be made following treatment or the damage caused by harmful addictive substances such as tobacco.
The technology combines motion-detecting sensors with the DMT Lab's expertise in freehand interaction in Mixed Reality to create a more realistic experience in virtual environments and bridge the gap between users and technology.
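The article does not describe how the freehand interaction is implemented, but the basic loop of such a system can be sketched: a hand-tracking sensor reports fingertip and palm positions each frame, and those readings are mapped onto transformations (scale, rotation) of the virtual model. The Python below is a minimal illustrative sketch only; the sensor call (read_hand_landmarks), the pinch threshold and the model methods are assumptions for illustration, not details of the DMT Lab's system.

```python
import numpy as np

PINCH_THRESHOLD = 0.03  # metres; assumed value, not from the article


def read_hand_landmarks():
    """Placeholder for a motion-sensor SDK call (e.g. a depth-camera driver).

    Returns a dict of 3D fingertip and palm positions in metres. The article
    does not specify the sensor or its API, so this stub stands in for it.
    """
    raise NotImplementedError("wire up to your hand-tracking sensor here")


def pinch_scale(thumb_tip, index_tip, base_distance=0.08):
    """Map the thumb-index distance to a uniform scale factor for the model."""
    distance = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
    return distance / base_distance


def palm_rotation(prev_palm, curr_palm, gain=2.0):
    """Convert palm displacement (x, y) into yaw/pitch deltas in radians."""
    dx, dy, _ = np.asarray(curr_palm) - np.asarray(prev_palm)
    return gain * dx, gain * dy  # (yaw, pitch)


# Example frame loop (model.set_scale / model.rotate are hypothetical methods
# on whatever 3D engine renders the anatomical model):
#
# prev = read_hand_landmarks()
# while True:
#     hand = read_hand_landmarks()
#     thumb, index = np.asarray(hand["thumb_tip"]), np.asarray(hand["index_tip"])
#     if np.linalg.norm(thumb - index) < PINCH_THRESHOLD:
#         model.set_scale(pinch_scale(hand["thumb_tip"], hand["index_tip"]))
#     yaw, pitch = palm_rotation(prev["palm"], hand["palm"])
#     model.rotate(yaw, pitch)
#     prev = hand
```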
Dr. Ian Williams said: "We are developing this system as a platform to allow medical professionals to interact with genuine patient data and manipulate it by hand to educate and inform patients.
"The real advantages this brings are being able to visually demonstrate parts of the anatomy, using virtual models which can be customized for each patient and show how they have been impacted by lifestyle choices or how they may be changed following treatments or surgery."
Staff at the University will be discussing the system and other Mixed Reality projects during an Open Day to be held on Saturday (June 10) at its City Centre Campus.
In the future, the system will be upgraded to replicate injuries, mobility problems or illnesses, and to show changes which could be made through lifestyle choices or medical procedures.
It could also help practitioners by creating a new way to view patient data in an array of settings.
Medical practitioners would be able to showcase medical procedures and treatment effects on customized medical models.
Surgeons would also be able to view and manipulate images of patients' bodies during procedures without having to remove their scrubs and gloves in sterile environments.
The use of customized models and an interactive environment that can be shared with the patient can help boost patients' engagement with, and understanding of, their treatment.