Indiana University scientists pioneer oculomics research with NIH support

Indiana University scientists with expertise in optometry and artificial intelligence are among the first three groups of researchers to receive support from a new National Institutes of Health program dedicated to the emerging field of "oculomics," which uses the eye as a lens on diseases that affect the whole body.

Stephen A. Burns, a professor at the IU School of Optometry, has been named a principal investigator on a three-year, $4.8 million award from the NIH Venture Program Oculomics Initiative. Among the co-investigators on the award is Eleftherios Garyfallidis, an associate professor of intelligent systems engineering at the IU Luddy School of Informatics, Computing and Engineering.

The project will support the development of next-generation ophthalmoscopes (instruments for observing the interior of the eye) that can spot the early warning signs of conditions such as diabetes, heart disease, kidney disease, sickle cell anemia and Alzheimer's disease with a simple eye scan.

"This research is about using the eye as a window on health," Burns said, noting that the retina is the only directly observable part of the central nervous system. "We want to give health care providers the clearest view they can hope to get into the body, non-invasively."

Additional researchers on the project include co-principal investigator Amani Fawzi of Northwestern University and co-investigators Alfredo Dubra of Stanford University and Toco Y. P. Chui of the New York Eye and Ear Infirmary of Mount Sinai.

Burns' research on using the eye to detect disease goes back to the early 2000s, when he and colleagues at the IU School of Optometry pioneered applying adaptive optics scanning laser systems to the observation of the human eye. Adaptive optics was originally developed by astronomers to eliminate the "twinkle" of stars (distortions caused by the Earth's atmosphere) in telescopes. The optics of the eye produce similar light distortions.

Using the technology developed at the school, the ophthalmoscope in Burns' lab can observe the back of the human eye at a resolution of two microns, a scale small enough to show the real-time movement of red blood cells inside the eye's arteries and veins. (A single red blood cell is approximately eight microns in width.) Burns has used the technology to identify biomarkers for diabetes and hypertension in the walls of the eye's blood vessels.

Project researchers from Northwestern and Mount Sinai have used similar technology to observe the cells both outside and inside these blood vessels, including spotting the crescent-shaped red blood cells found in sickle cell anemia. The Stanford researchers have used adaptive optics to improve observation of the eye's photoreceptors.

With support from the NIH, the research teams will integrate their individual projects into a single device and apply state-of-the-art machine learning and AI. They will also explore the technology's potential to spot the early signs of heart disease and Alzheimer's disease.

"There's growing evidence of a strong retinal vascular component to Alzheimer's disease. You can currently see the signs with PET scans, which require large, multimillion-dollar instruments. If we can see the same signs with an eye scan, it's a lot less invasive and a lot less costly."

Stephen A. Burns, Professor, IU School of Optometry

Garyfallidis will lead the development and application of the machine learning and AI methods used to interpret the devices' results. This could reduce diagnosis time from days to minutes by eliminating the need for a human to analyze the imagery.

In the first year of the project, the labs will align their instruments to the same level of sensitivity, said Burns, whose lab will integrate its technology with Northwestern's instrument. Stanford will focus on similar technological integrations with the instrument at New York Eye and Ear.

Next, the work will shift toward data validation to confirm that the new instruments' readings align with those from earlier versions of the technology. The researchers will also compare the new AI system's interpretation of scans against the conclusions of human analysts to confirm accuracy.

The final year of the project will involve testing the device on clinical volunteers. Much of IU's data will come from individuals recruited through the Atwater Eye Care Center.

"Up to 80 percent of the population over the age of 60 has at least one health issue that may be detectable in the eye with our technology," Burns said.

He said the NIH selected the project for its potential for high impact. Venture Fund initiatives emphasize "brief, modest investments that can be implemented quickly, with a strong potential to accelerate science."

"Our challenge now is selectivity and specificity," Burns said. "We need to show that we can detect the differences between conditions, to quickly and accurately interpret the signs of the various diseases we're focusing on."

The goal is to advance the technology until it is ready to make the leap from the lab to "wherever you get your annual eye exam," he said.
