NEI highlights new technologies and tools to help people living with low vision or blindness

During Low Vision Awareness Month, the National Eye Institute (NEI), part of the National Institutes of Health, is highlighting new technologies and tools in the works to help the 4.1 million Americans living with low vision or blindness. The innovations aim to help people with vision loss more easily accomplish daily tasks, from navigating office buildings to crossing a street. Many of the innovations take advantage of computer vision, a technology that enables computers to recognize and interpret the complex assortment of images, objects and behaviors in the surrounding environment.

Low vision means that even with glasses, contact lenses, medicine, or surgery, people find everyday tasks difficult to do. It can affect many aspects of life, from walking in crowded places to reading or preparing a meal, explained Cheri Wiggs, Ph.D., program director for low vision and blindness rehabilitation at the NEI. The tools needed to stay engaged in everyday activities vary based on the degree and type of vision loss. For example, glaucoma causes loss of peripheral vision, which can make walking or driving difficult. By contrast, age-related macular degeneration affects central vision, creating difficulty with tasks such as reading, she said.

Here's a look at a few NEI-funded technologies under development that aim to lessen the impact of low vision and blindness.

Co-robotic cane
Navigating indoors can be especially challenging for people with low vision or blindness. While existing GPS-based assistive devices can guide someone to a general location such as a building, GPS isn't much help in finding specific rooms, said Cang Ye, Ph.D., of the University of Arkansas at Little Rock. Ye has developed a co-robotic cane that provides feedback on a user's surrounding environment.

Ye's prototype cane has a computerized 3-D camera to "see" on behalf of the user. It also has a motorized roller tip that can propel the cane toward a desired location, allowing the user to follow the cane's direction. Along the way, the user can speak into a microphone, and a speech recognition system interprets verbal commands and guides the user via a wireless earpiece. The cane's credit-card-sized computer stores pre-loaded floor plans, though Ye envisions users eventually downloading floor plans via Wi-Fi upon entering a building. The computer analyzes 3-D information in real time and alerts the user to hallways and stairs. The cane gauges a person's location in the building by measuring the camera's movement with a computer vision method: it extracts details from the current camera image, matches them with details from the previous image, and tracks the progressively changing views to determine the user's position relative to a starting point. In addition to receiving NEI support, Ye was recently awarded a grant from the NIH's Coulter College Commercializing Innovation Program to explore commercialization of the robotic cane.
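That frame-to-frame matching is, in essence, visual odometry. The sketch below illustrates the general technique using OpenCV's ORB features on ordinary 2-D grayscale frames; it is a simplified illustration under assumed inputs (a known camera matrix, consecutive frames), not Ye's actual implementation, which works with data from a 3-D camera.

```python
# Illustrative sketch of frame-to-frame feature matching (visual odometry),
# the general technique described above; not Ye's actual implementation.
import cv2
import numpy as np

def estimate_motion(prev_gray, curr_gray, camera_matrix):
    """Estimate relative camera motion between two consecutive grayscale frames."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # not enough texture in the scene to match features

    # Match binary descriptors between the previous and current frames.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    if len(matches) < 8:
        return None

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Recover rotation R and translation direction t relative to the last frame;
    # chaining these estimates over time gives a pose relative to the start point.
    E, mask = cv2.findEssentialMat(pts1, pts2, camera_matrix, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, camera_matrix, mask=mask)
    return R, t
```

Chaining the rotation and translation estimates from successive frame pairs is what lets a system of this kind report where the user is relative to where they started.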

Robotic glove finds door handles, small objects
In the process of developing the co-robotic cane, Ye realized that closed doorways pose yet another challenge for people with low vision or blindness. "Finding the door knob or handle and getting the door open slows you way down," he said. To help someone with low vision locate and grasp small objects more quickly, he designed a fingerless glove device.

On the glove's back surface are a camera and a speech recognition system, which let the user give voice commands such as "door handle," "mug," "bowl," or "bottle of water." The glove then guides the user's hand via tactile prompts to the desired object. "Guiding the person's hand left or right is easy," Ye said. "An actuator on the thumb's surface takes care of that in a very intuitive and natural way." Prompting a user to move his or her hand forward or backward, and conveying a feel for how to grasp an object, are more challenging.
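The "easy" part of that guidance, steering the hand left or right, can be pictured as mapping where the target appears in the camera frame to a cue on the thumb actuator. The function below is a hypothetical sketch of that mapping; the names, dead-zone width, and cue strings are assumptions rather than the glove's firmware. The harder forward/backward and grasping cues are handled by the tactile system described next.

```python
# Hypothetical sketch: map the target's horizontal position in the camera
# image to a left/right cue for a thumb-mounted actuator.
def lateral_cue(target_x_px, frame_width_px, dead_zone_frac=0.1):
    """Return 'left', 'right', or 'hold' based on where the object appears."""
    center = frame_width_px / 2
    dead_zone = frame_width_px * dead_zone_frac
    offset = target_x_px - center
    if abs(offset) <= dead_zone:
        return "hold"          # object is roughly in line with the hand
    return "left" if offset < 0 else "right"
```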

Ye's colleague Yantao Shen, Ph.D., of the University of Nevada, Reno, developed a novel hybrid tactile system comprising an array of cylindrical pins that deliver either a mechanical or an electrical stimulus. The electrical stimulus produces an electrotactile sensation, meaning it excites the nerves in the skin of the hand to simulate a sense of touch. Picture four cylindrical pins aligned down the length of your index finger. One by one, starting with the pin closest to your fingertip, the pins pulse in a pattern indicating that the hand should move backward.

The reverse pattern indicates the need for forward motion. Meanwhile, a larger electrotactile system on the palm uses a series of cylindrical pins to create a 3-D representation of the object's shape. For example, if your hand is approaching the handle of a mug, you would sense the handle's shape in your palm and could adjust the position of your hand accordingly. As your hand moves toward the mug handle, the camera notes any slight shifts in angle, and the tactile sensation on your palm changes to reflect them.
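One way to picture the pulsing pattern is as an ordered sweep across the four pins: firing them from fingertip toward knuckle means "move your hand back," and the reverse order means "move it forward." The snippet below is purely illustrative; the pin numbering, timing, and pulse_fn hook are assumptions, not the device's actual protocol.

```python
import time

# Hypothetical pin indices along the index finger: 0 = closest to the fingertip.
FINGER_PINS = [0, 1, 2, 3]

def pulse_sequence(direction, pulse_fn, interval_s=0.15):
    """Fire pins one by one to cue hand motion.

    'back'    -> fingertip-to-knuckle order (as described in the article)
    'forward' -> the reverse order
    pulse_fn is a stand-in for whatever actually drives a pin.
    """
    order = FINGER_PINS if direction == "back" else list(reversed(FINGER_PINS))
    for pin in order:
        pulse_fn(pin)
        time.sleep(interval_s)
```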

Smartphone crosswalk app
Street crossings can be especially dangerous for people with low vision. James Coughlan, Ph.D., and his colleagues at the Smith-Kettlewell Eye Research Institute have developed a smartphone app that gives auditory prompts to help users identify the safest crossing location and stay within the crosswalk.

The app harnesses three technologies and cross-checks them against one another. A global positioning system (GPS) pinpoints the intersection where a user is standing. Computer vision then scans the area for crosswalks and walk lights. That information is integrated with a geographic information system (GIS) database containing a crowdsourced, detailed inventory of an intersection's quirks, such as the presence of road construction or uneven pavement. The three technologies compensate for each other's weaknesses. For example, while computer vision may lack the depth perception needed to detect a median in the center of the road, that local knowledge would be included in the GIS database. And while GPS can adequately localize the user to an intersection, it cannot identify which corner the user is standing on. Computer vision determines the corner, as well as where the user is in relation to the crosswalk, the status of the walk lights and traffic lights, and the presence of vehicles.
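In outline, that fusion can be thought of as layering three answers: GPS says which intersection, computer vision says which corner and what the signals show, and the GIS record contributes the crowdsourced local details. The sketch below is a hypothetical illustration of that structure, not the app's code; every field name and the gis_db lookup are assumptions.

```python
from dataclasses import dataclass

@dataclass
class CrossingAdvice:
    intersection_id: str   # from GPS: which intersection the user is at
    corner: str            # from computer vision: which corner, e.g. "NE"
    walk_light_on: bool    # from computer vision: current signal state
    hazards: list          # from GIS: crowdsourced notes (median, construction, ...)

def build_advice(gps_fix, vision_report, gis_db):
    """Combine the three sources into one summary ready for a spoken prompt."""
    intersection = gps_fix["nearest_intersection"]   # GPS localizes to an intersection
    record = gis_db.get(intersection, {})            # GIS adds the local quirks
    return CrossingAdvice(
        intersection_id=intersection,
        corner=vision_report["corner"],              # vision resolves the corner
        walk_light_on=vision_report["walk_light_on"],
        hazards=record.get("hazards", []),
    )
```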

CamIO system helps explore objects in a natural way
Imagine a system that enables visually impaired biology students to explore a 3-D anatomical model of a heart by touching an area and hearing "aortic arch" in response. The same system could also be used to get an auditory readout of the display on a device such as a glucose monitor. The prototype system, designed with a low-cost camera connected to a laptop computer, can make physical objects, from 2-D maps to the digital displays on microwaves, fully accessible to users with low vision or blindness.

The CamIO system (short for camera input-output), also under development by Coughlan, provides real-time audio feedback as the user explores an object in a natural way, turning it around and touching it. Holding a finger stationary on a 3-D or 2-D object signals the system to provide an audible label of the location in question or an enhanced image on a laptop screen. CamIO was conceived by Joshua Miele, Ph.D., a blind scientist at Smith-Kettlewell who develops and evaluates novel sound/touch interfaces to help people with vision loss. Coughlan plans to develop a smartphone app version of CamIO. In the meantime, software for the laptop version will be available for free download. To watch a demonstration of the CamIO system, visit http://bit.ly/2CamIO.
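As described, the core interaction reduces to noticing that a tracked fingertip has dwelled in one spot and then speaking the label attached to that part of the object. The sketch below illustrates that loop under assumed inputs; the region coordinates, thresholds, and speak callback are placeholders, not CamIO's implementation.

```python
import math

# Hypothetical labeled regions on a 3-D heart model, in the object's own coordinates.
REGIONS = {"aortic arch": (120, 45), "left ventricle": (90, 160)}

def nearest_label(finger_xy, max_dist=25):
    """Return the label of the region closest to the fingertip, if close enough."""
    best, best_d = None, float("inf")
    for label, (x, y) in REGIONS.items():
        d = math.hypot(finger_xy[0] - x, finger_xy[1] - y)
        if d < best_d:
            best, best_d = label, d
    return best if best_d <= max_dist else None

def announce_if_dwelling(finger_track, speak, dwell_frames=30, tol=5):
    """If the fingertip has barely moved over the last N frames, speak the label."""
    if len(finger_track) < dwell_frames:
        return
    xs = [p[0] for p in finger_track[-dwell_frames:]]
    ys = [p[1] for p in finger_track[-dwell_frames:]]
    if max(xs) - min(xs) <= tol and max(ys) - min(ys) <= tol:
        label = nearest_label(finger_track[-1])
        if label:
            speak(label)   # hand off to any text-to-speech engine
```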

High-powered prisms, periscopes for severe tunnel vision
People with retinitis pigmentosa and glaucoma can lose most of their peripheral vision, making it challenging to walk in crowded places like airports or malls. People with severe peripheral vision loss can have a residual central island of vision that's as little as 1 to 2 percent of their full visual field. Eli Peli, O.D., of Schepens Eye Research Institute, Boston, has developed lenses constructed of many adjacent one-millimeter-wide prisms that expand the visual field while preserving central vision. Peli designed a high-powered prism, called a multiplexing prism, that expands the field of view by about 30 degrees. "That's an improvement, but it's not good enough," explained Peli.

In a study, he and his colleagues mathematically modeled people walking in crowded places and found that the risk of collision is highest when other pedestrians approach from a 45-degree angle. To expand the visual field to that angle, he and his colleagues are employing a periscope-like concept. Periscopes, such as those used to see the ocean surface from a submarine, rely on a pair of parallel mirrors that shift an image, providing a view that would otherwise be out of sight. Applying a similar concept, but with non-parallel mirrors, Peli and colleagues have developed a prototype that achieves a 45-degree visual field. Their next step is to work with optical labs to manufacture a cosmetically acceptable prototype that can be mounted into a pair of glasses. "It would be ideal if we could design magnetic clip-on spectacles that could be easily mounted and removed," he said.
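For readers wondering how prism strength relates to the roughly 30-degree shift mentioned above, standard textbook optics offers a rule of thumb. The relations below are general conventions, not details of Peli's multiplexing prism design.

```latex
% General thin-prism relations (textbook optics, not specific to Peli's design).
\[
  \delta \approx (n - 1)\,A
  \qquad\text{(deviation of a thin prism with apex angle } A \text{ and index } n\text{)}
\]
\[
  P = 100\,\tan\theta
  \quad\Longrightarrow\quad
  \theta = \arctan\!\left(\frac{P}{100}\right)
  \qquad\text{(prism power } P \text{ in prism diopters)}
\]
% A shift of roughly 30 degrees therefore corresponds to
% P \approx 100 \tan 30^\circ \approx 57 prism diopters.
```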
