University of Miami psychologist finds patterns of nonverbal emotional communication between infants and mothers to help develop a baby robot that learns
To help unravel the mysteries of human cognitive development and reach new frontiers in robotics, University of Miami (UM) developmental psychologists and computer scientists from the University of California, San Diego (UC San Diego) are studying infant-mother interactions and working to implement their findings in a baby robot capable of learning social skills.
The first phase of the project studied face-to-face interactions between mother and child, to learn how predictable early communication is and to understand what babies need in order to act intentionally. The findings are published in the current issue of the journal Neural Networks in a study titled "Applying machine learning to infant interaction: The development is in the details."
The scientists observed 13 mothers and their babies, between 1 and 6 months of age, during weekly five-minute play sessions, with approximately 14 sessions per dyad. The laboratory sessions were videotaped, and the researchers applied an interdisciplinary approach to understanding the behavior.
The researchers found that in the first six months of life, babies develop turn-taking skills, the first step toward more complex human interactions. According to the study, babies and mothers find a pattern in their play, and that pattern becomes more stable and predictable with age, explains Daniel Messinger, associate professor of Psychology in the UM College of Arts and Sciences and principal investigator of the study.
"As babies get older, they develop a pattern with their moms," says Messinger. "When the baby smiles, the mom smiles; then the baby stops smiling and the mom stops smiling, and the babies learn to expect that someone will respond to them in a particular manner," he says. "Eventually the baby also learns to respond to the mom."
The next phase of the project is to use the findings to program a baby robot with basic social skills and the ability to learn more complicated interactions. The robot's name is Diego-San. It stands 1.3 meters tall and is modeled after a 1-year-old child. The construction of the robot was a joint venture between Kokoro Dreams and the Machine Perception Laboratory at UC San Diego.
The robot will need to shift its gaze from people to objects based on the same principles babies seem to use as they play and develop. "One important finding here is that infants are most likely to shift their gaze if they are the last ones to do so during the interaction," says Messinger. "What matters most is how long a baby looks at something, not what they are looking at."
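As a rough illustration of a duration-based gaze rule like the one Messinger describes, the following Python sketch makes the probability of a gaze shift grow with how long the current look has lasted. The hazard-style function and all its parameters are assumptions for illustration, not values or methods from the study.

```python
import random

def gaze_shift_prob(look_duration, base=0.05, growth=0.1):
    """Illustrative rule: the longer the current look has lasted,
    the more likely a gaze shift becomes. Parameters are assumed."""
    return min(1.0, base + growth * look_duration)

def simulate_gaze(steps=30, seed=1):
    rng = random.Random(seed)
    look_duration = 0  # time steps spent on the current gaze target
    shifts = []
    for t in range(steps):
        if rng.random() < gaze_shift_prob(look_duration):
            shifts.append(t)
            look_duration = 0  # a shift starts the clock on a new look
        else:
            look_duration += 1
    return shifts

print("gaze shifts at steps:", simulate_gaze())
```

The design choice here mirrors the quoted finding: the decision to shift depends on the duration of the current look, not on the identity of the target.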
The process comes full circle: the babies teach the researchers how to program the robot, and in training the robot the researchers gain insight into the process of human behavioral development, explains Paul Ruvolo, a sixth-year graduate student in the Computer Science Department at UC San Diego and co-author of the study.
"A unique aspect of this project is that we have state-of-the-art tools to study development on both the robotics and developmental psychology side," says Ruvolo. "On the robotics side we have a robot that mechanically closely approximates the complexity of the human motor system and on the developmental psychology side we have a fine-grained motion capture and video recording that shows the mother infant action in great detail," he says. "It is the interplay of these two methods for studying the process of development that has us so excited."
Ultimately, the baby robot will give scientists insight into what motivates a baby to communicate and will help answer questions about the development of human learning. The study is funded by the National Science Foundation.