Feb. 27, 2017
Research has demonstrated the importance of early access to sound and spoken language for newborns and has led to significant advances in hearing screening and early intervention. Despite this progress, children with hearing loss still lag behind their normal-hearing peers, on average, in spoken language acquisition, reading levels and educational outcomes.
Researchers from The Ohio State University Wexner Medical Center have launched a study seeking to understand how deaf infants with cochlear implants absorb information and learn novel words during interactions with their parents, with the goal of improving the guidance parents receive for supporting language development.
"Our research uses a new methodology developed by colleagues at IU-Bloomington and adds high-tech sensing and computing technology to the traditional behavioral methodology of recording infant-parent interactions to investigate their reciprocal roles in language acquisition and cognitive development," said Derek Houston, lead investigator and associate professor of otolaryngology at the Buckeye Center for Hearing and Development at Ohio State Wexner Medical Center.
The study has two specific aims. First, researchers are investigating the role of deafness and subsequent cochlear implantation in infant-parent communicative interactions and related word learning, collecting communicative interaction data from deaf infants before and after cochlear implantation as well as from age-matched children with normal hearing. Interactions in both groups are analyzed across several sessions, and changes are documented.
Second, investigators are evaluating whether deaf children with cochlear implants benefit from the same word-learning cues as children with normal hearing. They're collecting and analyzing communicative interaction data and conducting assessments of novel word learning among deaf infants with 12 to 18 months of cochlear implant experience, as well as age-matched controls.
During the audio-recorded sessions, the infant and parent wear head-mounted cameras with eye-tracking devices to document precisely where the child's attention is focused as the parent presents a toy with an unusual name. From six different angles, the technology records the child's reaction when a parent says a new word, and researchers review the footage for patterns and signs of word recognition.
"The innovative technology allows for sophisticated methods of integrating, analyzing and data-mining multimodal data from all of the cameras, eye trackers and microphone to perform micro-level behavioral analyses of interactive events, such as the rate at which the infant and parent look at the same object at the same time, also known as coordinated attention," Houston said.
Houston and his team say they're discovering how hearing loss affects the dynamic interaction between infant and parent, and how those effects shape the child's broader cognitive and language development. They hope to extend the research to other clinical populations as well, such as children with attention-deficit/hyperactivity disorder and autism spectrum disorder.