
Babies’ Brains Are Primed for Their Native Language Before Birth

Newborn babies’ brain waves are in tune with the language they were exposed to most often in utero. | AdobeStock

Human babies pick up language at an exceptional pace during their first year of life, but it has mostly been a mystery whether exposure to language before birth primes their brains to acquire a specific language. Now, new research in Science Advances suggests that newborn babies' brain waves are in tune with the language they were exposed to most often in utero.

"These results provide the most compelling evidence to date that language experience already shapes the functional organization of the infant brain, even before birth," the authors write.

Although most newborns are considered "universal listeners" — equipped to learn any possible human language — by their first birthday, babies' brains become specialized for the sounds of their native language. While this first year is pivotal for language development, research suggests that prenatal experience may also help lay the groundwork for auditory and speech perception.

Between five and seven months of gestation, a fetus can begin to hear sounds outside the womb. Within days of birth, infants show a preference for their mother's voice and native language. Newborns can also recognize rhythms and melodies heard in utero, and prenatal exposure to music may help them develop musical abilities. But it has been unclear whether the same can be said for language.

Now, Benedetta Mariani, a Ph.D. student at the Padova Neuroscience Center at the University of Padova, and colleagues have found that sleeping babies who were most recently exposed to their mother's native language exhibited brain signals associated with long-term speech and language learning.

Measuring Baby Brain Waves

The researchers recruited 33 native French-speaking expectant mothers from the maternity ward of Robert Debré Hospital in Paris and used a technique called electroencephalography (EEG) to monitor the brain waves of their babies between one and five days after birth.

"In adults, we know that a series of neural oscillations or brain waves play a role in understanding speech and language," said coauthor Judit Gervain, professor in the department of developmental and social psychology at the University of Padova and senior research scientist at the Integrative Neuroscience and Cognition Center, CNRS and Université Paris Cité. "Waves oscillating at different frequencies align with the rhythms of different units in speech, such as the syllable or individual speech sounds."

In this experiment and in prior work, the researchers used EEG to identify whether this brain architecture, present in adults with much more language experience, was already present to some degree in the newborn brain — and if so, whether the rhythms their brains produce could already align with the rhythms of the language they heard most often in the womb.

As the babies slept, the researchers played French, Spanish and English language versions of the children's fairytale "Goldilocks and the Three Bears" in various orders, each set starting and ending with three minutes of silence, which is when they recorded the babies' brain waves. The babies were outfitted with caps containing 10 active electrodes, placed in areas overlying brain regions associated with auditory and speech perception in infants.

The electrodes measured electrophysiological activity as frequency signals, which helped the researchers determine whether hearing these languages activated brain waves associated with processing different elements of speech — such as theta oscillations (4 to 8 Hertz), which are linked to hearing syllables, or gamma oscillations (30 to 60 Hertz), which are related to distinct units of sound known as phonemes.
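To illustrate how a recording can be decomposed into these frequency bands, here is a minimal sketch in Python using SciPy band-pass filters. The sampling rate, filter order and synthetic "EEG" signal are assumptions for illustration, not details from the study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(signal, low_hz, high_hz, fs):
    """Fourth-order Butterworth band-pass filter, applied forward and
    backward (zero phase distortion)."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

fs = 250  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)

# Synthetic signal: a 6 Hz (theta-range) and a 40 Hz (gamma-range)
# component buried in noise, standing in for a real EEG trace.
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * 6 * t)
       + 0.5 * np.sin(2 * np.pi * 40 * t)
       + 0.2 * rng.standard_normal(t.size))

theta = bandpass(eeg, 4, 8, fs)    # syllable-rate band
gamma = bandpass(eeg, 30, 60, fs)  # phoneme-rate band
```

Comparing power in each filtered band across conditions is one standard way such oscillatory activity is quantified; the study's actual analysis pipeline is more involved.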

"EEG is effective because it directly measures brain activity at a time scale, in milliseconds, that is necessary to detect the temporal dynamics of neural oscillations," Gervain explained. "EEG is fully non-invasive and well-tolerated, [even by] young infants."

The EEG signals were processed using a method that measures the degree of "memory" (long-range correlations) contained within them, Mariani explained. "In our case, this measure showed evidence of language learning — in other words, lasting changes in the brain dynamics after exposure to language, specifically after the prenatally heard language," Mariani said.
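One common way to quantify such long-range correlations in a time series is detrended fluctuation analysis (DFA); the sketch below shows the idea, though the authors' exact pipeline may differ. A scaling exponent near 0.5 indicates a memoryless signal, while larger values indicate persistent long-range correlations.

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Estimate the DFA scaling exponent alpha of a 1-D signal.

    Theoretically, alpha ~ 0.5 for white noise (no memory), and
    alpha > 0.5 signals long-range correlations ("memory")."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated signal
    fluctuations = []
    for n in scales:
        n_windows = profile.size // n
        # Split the profile into non-overlapping windows of length n
        segments = profile[: n_windows * n].reshape(n_windows, n)
        x = np.arange(n)
        rms = []
        for seg in segments:
            # Remove the local linear trend within each window
            trend = np.polyval(np.polyfit(x, seg, 1), x)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    # Slope of log F(n) versus log n is the scaling exponent
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
white = rng.standard_normal(10_000)   # memoryless signal
walk = np.cumsum(white)               # strongly correlated signal

alpha_white = dfa_exponent(white, scales)  # expected near 0.5
alpha_walk = dfa_exponent(walk, scales)    # expected well above 1
```

This toy comparison mirrors the logic of the analysis: a signal whose fluctuations carry memory of the past yields a measurably different exponent than one that does not.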

Language Development in the First Years of Life

Do babies who miss out on this prenatal "language priming" — such as international adoptees or infants who were born deaf — suffer developmentally later in life? Gervain answered that this is not necessarily the case. Prenatal language experience scaffolds or supports language development, but does not determine developmental outcomes, she explained.

This study was part of a larger project led by Gervain to understand how language and speech perception develops prenatally as well as in the pivotal first couple of years of life, when development is shaped both by prenatal and postnatal experience.

"We are investigating and following up on infants at various ages to see whether and how these neural mechanisms support later language development," said Gervain. "[EEG] can be used across the life span, so we could use the same method and study design for the various ages we studied in the bigger project."

"This technique could certainly help us in the future to quantify how the learning abilities change with the baby's age, and which frequency bands are targeted by the language learning at different ages," Mariani added.

Gervain described how babies who are one day, six months or two years old exhibit different patterns of brain activity, as postnatal experience continues to fine-tune their neural oscillations. In the future, studying these patterns at different stages of life could help researchers better understand important language development milestones, such as learning words.

Author

Nyla Husain
