The nature of general visual category representations for various object categories in 6- to 8-month-old infants

Forming visual categories is a fundamental aspect of human cognition. Infancy is a crucial period for building such representations, yet how they are represented in the infant brain remains largely unknown. A recent study uses electroencephalography (EEG) to show how visual categories are represented in the infant brain and how these representations correspond to those of adults.

Study: Visual category representations in the infant brain. Image Credit: marikun/Shutterstock

Introduction

Visual objects need to be recognized quickly and effortlessly and assigned to the right category, an ability that develops as infants explore their environment. Earlier research using looking times and other attention markers has shown that infants categorize and process such objects, contributing to cognitive learning before the end of the first year of life.

This process has been examined in adult humans and non-human primates, including how category representations unfold over time, which cortical regions they involve, and how they relate to neural oscillations. There is a need to extend such work to infants, using direct neural measures that generalize to a broader set of visual categories.

The current study, published in Current Biology, set out to do this. The scientists applied a multivariate classification framework to EEG data, quantitatively comparing infant and adult recordings and relating both to deep learning models of vision.
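To make the general analysis approach concrete, the following is a minimal sketch of time-resolved multivariate classification on epoched EEG. The synthetic data, array shapes, and choice of a linear discriminant classifier are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of time-resolved category classification on epoched EEG.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Epoched EEG as trials x channels x time; shapes are illustrative placeholders.
n_trials, n_channels, n_times = 120, 32, 200
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)  # two categories, e.g. faces vs. houses

# Decode the category separately at each time point; the resulting accuracy curve
# shows when category information becomes available in the scalp signal.
accuracy = np.empty(n_times)
for t in range(n_times):
    accuracy[t] = cross_val_score(
        LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5
    ).mean()
```

The accuracy curve produced this way is the kind of time course whose onset, peak, and decline are described below.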

What did the study show?

The study included 40 infants who were shown 128 images of real-world objects from four categories: toys, bodies, houses, and faces. Each image was shown for two seconds, with 2.7–2.9 seconds between images. The same images were shown to 20 adults to allow direct comparison.

In infants, classification accuracy rose from about 100 ms after image onset and became statistically significant at roughly 250 ms. Peak classification occurred at about 416 ms, after which accuracy slowly declined. This pattern held across object categories, living and non-living alike.

In adults, the classification curve emerged at about 70 ms and peaked at 154 ms, markedly faster than in infants. The infant delay may reflect longer latencies even at early stages of cortical processing, as seen in the later peak of the P100 component; in addition, the mean ERP peak in infants was delayed by up to 242 ms.
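For illustration, onset and peak latencies like those quoted above can in principle be read off a decoding-accuracy curve as sketched below. The curve here is synthetic, and the fixed accuracy threshold is a placeholder; the study itself relied on proper statistical testing rather than a hard cutoff.

```python
import numpy as np

rng = np.random.default_rng(1)
times = np.linspace(-0.2, 0.8, 200)  # seconds relative to image onset (assumed window)

# Synthetic accuracy curve standing in for the output of a time-resolved analysis.
accuracy = (0.5
            + 0.2 * np.exp(-((times - 0.4) ** 2) / 0.02)
            + 0.01 * rng.standard_normal(times.size))

chance = 0.5
threshold = chance + 0.05  # placeholder cutoff; the study used proper statistics

above = np.where(accuracy > threshold)[0]
onset_latency = times[above[0]]            # first time the curve clears the cutoff
peak_latency = times[np.argmax(accuracy)]  # time of maximum classification
print(f"onset ~{onset_latency * 1000:.0f} ms, peak ~{peak_latency * 1000:.0f} ms")
```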

"This suggests that the observed peak latency differences with which category representations emerge reflect a mixture of processing delays at early and late processing stages," the researchers note.

Using a multivariate approach also helps map visual category representations to behavior. The findings showed rapid rises and falls in the classification curves over a few hundred milliseconds. This likely reflects a quick cascade of processing steps along shared neural pathways, producing rapidly changing representations early on and more sustained representations later.

As in adults, the representations were classified most accurately along the diagonal of the temporal generalization matrix, that is, when classifiers were trained and tested at the same time points. In contrast to adults, however, feedforward and feedback information appeared to be incompletely processed in the immature infant brain. These differences could in principle be traced to divergences in experimental design or signal-to-noise ratio (SNR), but this was considered less likely because the analysis focused on peaks.
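A minimal sketch of such a temporal generalization analysis, using synthetic placeholder data and an assumed linear classifier rather than the authors' exact pipeline, might look as follows. Accuracy concentrated near the diagonal (train time roughly equal to test time) indicates rapidly changing representations, whereas off-diagonal generalization indicates sustained ones.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_train, n_test, n_channels, n_times = 80, 40, 32, 100
X_train = rng.standard_normal((n_train, n_channels, n_times))
X_test = rng.standard_normal((n_test, n_channels, n_times))
y_train = rng.integers(0, 2, n_train)
y_test = rng.integers(0, 2, n_test)

# Train at each time point and test at every other: rows index training time,
# columns index testing time. The diagonal corresponds to train time == test time.
generalization = np.empty((n_times, n_times))
for t_train in range(n_times):
    clf = LinearDiscriminantAnalysis().fit(X_train[:, :, t_train], y_train)
    for t_test in range(n_times):
        generalization[t_train, t_test] = clf.score(X_test[:, :, t_test], y_test)
```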

Infants and adults were found to share visual category representations, with similarity emerging over combinations of time points spanning roughly 160–540 ms in infants and 100–1,000 ms in adults. The peak latency of this shared signal was about 200 ms in infants versus 120 ms in adults. The result held under alternative processing and data-aggregation choices and across most categories, with the exception of toys.
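Comparing representations between groups across combinations of time points is the kind of question representational similarity analysis (RSA) can address: summarize each group's category structure at each time point as a dissimilarity matrix, then correlate the matrices across groups and time. The sketch below illustrates this idea with synthetic category-averaged patterns and a correlation-distance metric, which are assumptions for illustration rather than the study's exact method.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_categories, n_channels, n_times = 4, 32, 50  # e.g. toys, bodies, houses, faces

# Category-averaged EEG patterns (categories x channels x time), one per group.
infant = rng.standard_normal((n_categories, n_channels, n_times))
adult = rng.standard_normal((n_categories, n_channels, n_times))

def rdm(patterns):
    """Condensed dissimilarity matrix over categories (correlation distance)."""
    return pdist(patterns, metric="correlation")

# Correlate infant and adult category structure for every pair of time points.
similarity = np.empty((n_times, n_times))  # infant time x adult time
for ti in range(n_times):
    infant_rdm = rdm(infant[:, :, ti])
    for ta in range(n_times):
        similarity[ti, ta] = spearmanr(infant_rdm, rdm(adult[:, :, ta]))[0]
```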

Nevertheless, infants' brains shared large-scale dynamics with adults' over time, leading to similar encoding of visual categories. The infant representations were based on features of low to intermediate complexity, amounting to a subset of the adult representations, which discriminate categories using features of all levels of complexity. Both less and more complex features are encoded in the high-level ventral visual cortex, with categories discriminated by these features to different degrees.

Spatial frequencies correlated better with categories in adults than in infants, suggesting that adults rely more on such frequencies. Moreover, the differences in representation between adults and infants were observed despite similar EEG power spectral profiles across categories. Adults showed category information across a broadband spectrum of neural oscillations, whereas the theta band was the distinctive signature of visual category representation in infants.
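Regarding the spatial-frequency point above, the sketch below illustrates one way image spatial-frequency content can be related to category structure: summarize each image by a radially averaged Fourier power spectrum and build a category-level dissimilarity matrix from those descriptors, which could then be compared with neural dissimilarities. The random images, image size, and binning are placeholders; the study's actual image-statistics analysis may differ.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_images, size = 128, 64                             # 128 stimuli, 64x64 pixels (assumed)
images = rng.standard_normal((n_images, size, size))
categories = np.repeat(np.arange(4), n_images // 4)  # toys, bodies, houses, faces

def sf_profile(img, n_bins=16):
    """Radially averaged Fourier power spectrum as a coarse spatial-frequency descriptor."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    yy, xx = np.indices(power.shape)
    radius = np.hypot(yy - img.shape[0] / 2, xx - img.shape[1] / 2)
    edges = np.linspace(0, radius.max(), n_bins + 1)
    return np.array([power[(radius >= lo) & (radius < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

profiles = np.array([sf_profile(img) for img in images])

# Average the descriptors per category and measure pairwise category dissimilarity;
# this spatial-frequency RDM could then be compared with neural RDMs.
category_means = np.array([profiles[categories == c].mean(axis=0) for c in range(4)])
sf_rdm = pdist(category_means, metric="correlation")
```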

Infants and adults shared these representations only in the alpha/beta band. In infants, the theta rhythm is associated with neural networks for learning and memory and may help form visual category representations; in adults, such representations are processed rapidly in mature semantic networks, which would explain the alpha/beta rhythms.
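One common way to ask which rhythm carries category information is to band-pass filter the epoched EEG into frequency ranges such as theta (here assumed 4–8 Hz) and alpha/beta (8–30 Hz) before running the same classification. The sketch below illustrates that idea on synthetic data; the sampling rate, filter design, band edges, and band-power features are assumptions, not the study's exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
sfreq = 250.0                                  # sampling rate in Hz (assumed)
n_trials, n_channels, n_times = 120, 32, 250   # 1 s of data per trial (assumed)
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)

def bandpass(data, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter applied along the time axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

for name, (low, high) in {"theta": (4, 8), "alpha/beta": (8, 30)}.items():
    X_band = bandpass(X, low, high, sfreq)
    # Use mean band-limited power per channel as features; this only illustrates the
    # idea of decoding from band-limited signals, not the study's exact analysis.
    features = (X_band ** 2).mean(axis=-1)
    acc = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.2f}")
```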

The researchers suggest that this frequency signature shifts upwards slowly as the infant brain develops, with the magnitude of the shift in line with myelination. Other explanations have also been proposed, all of which take the power in a given frequency band as an indicator of oscillations at that frequency.

What are the implications?

"In sum, the emerging picture is one of not yet fully developed dynamics of adult-like visual category representations in infants." That is, infants developed representations later after exposure, at slower speeds, and did not have some aspects of feedforward and recurrent processing. This could be due to incomplete myelin development and poor synaptic connections formed in the infant's brain compared to the adult brain.

The approach used in this study extends previous EEG-based research on the development of visual processing by measuring representations directly and with high temporal resolution. It also allows almost any visual category to be studied, while enabling quantitative comparisons of infant and adult visual category representations.

The scientists predict that, as infants grow, such representations will probably emerge earlier and speed up as feedforward and feedback loops mature at key stages. This prediction can be tested using the same approach in other age groups, and such data could help clarify infant cognitive capacities and their development in the years to come.

These data also suggest constraints for artificial intelligence researchers developing deep learning models, which may need to follow a similar developmental course to achieve biological realism.

These results show how fast and efficient visual categorization becomes as humans mature, and they reveal developmental changes in how cortical information is transmitted. Further research based on these powerful EEG techniques could shed more light on these questions.

Journal reference:

Visual category representations in the infant brain. Current Biology.

Written by

Dr. Liji Thomas

