If you are a fan of wildlife shows, you’ve probably seen those tiny video cameras rigged to animals in the wild that provide a sneak peek into their secret domains. But not all research cams are mounted on creatures with fur, feathers, or fins. One of NIH’s 2014 Early Independence Award winners has developed a baby-friendly, head-mounted camera system (shown above) that captures the world from an infant’s perspective and explores one of our most human, but still imperfectly understood, traits: language.
Elika Bergelson, a young researcher at the University of Rochester in New York, wants to know exactly how and when infants acquire the ability to understand spoken words. Using innovative camera gear and other investigative tools, she hopes to refine current thinking about the natural timeline for language acquisition. Bergelson also hopes her work will provide a firmer theoretical foundation for clinicians who assess children with poor verbal skills or with neurodevelopmental conditions that impair information processing, such as autism spectrum disorders.
Already, Bergelson has made progress toward building that firmer foundation. In her doctoral work at the University of Pennsylvania, she and her advisor Daniel Swingley showed that infants begin understanding words about six months after birth. Previously, many researchers believed that babies were unable to shift their focus from sounds and syllables to the meaning of words until about 12 months of age. Using a laboratory-based system that tracked infants’ eye movements as they were asked to identify common objects on a computer screen, Bergelson and Swingley found that some 6-month-old babies could understand, to a certain degree, about a dozen nouns, such as “apple” or “hair.” This finding might explain why previous research has shown that children with hearing impairments did better when they were fitted with cochlear implants at 6 months old, rather than just a few months later.
Next, Bergelson plans to conduct a year-long, home-based study of approximately 50 babies to examine in finer detail how sights and sounds get soaked up—and eventually verbalized—by youngsters ages 6 to 18 months, a critical period for learning words. In the study, called the Study of Environmental Effects on Developing LINGuistic Skills (SEEDLingS), parent volunteers will place tiny cameras on their babies’ heads for one hour each month and interact with them just as they usually do. In addition to this baby’s-eye view of the world, Bergelson will also record, on another day each month, all of the audio input the child receives from parents and other people, pets, or devices in the home.
Bergelson will then review the home audiovisual recordings to glean clues into how seeing and hearing drive a baby’s ability to learn words. She also will ask parents to bring their children to the Rochester Baby Lab to participate in experiments that track their eye movements for familiar object-word pairs, such as “hand” and “bottle.” The aim is to test how children react to familiar and novel objects shown on a computer screen, and whether they can pair those objects with known words and sounds.
As a quick footnote, Bergelson and Noah Simon, a fellow Early Independence Award winner at the University of Washington in Seattle, were recently honored in the science category of Forbes’ “30 Under 30,” an annual list of the most important young innovators and influencers. Congratulations to them both!
References:

Bergelson E, Swingley D. At 6-9 months, human infants know the meanings of many common nouns. Proc Natl Acad Sci U S A. 2012 Feb 28;109(9):3253-8.

Yoshinaga-Itano C, Sedey AL, Coulter DK, Mehl AL. Language of early- and later-identified children with hearing loss. Pediatrics. 1998 Nov;102(5):1161-71.
Links:

Elika Bergelson, University of Rochester, NY
Mechanisms of Word Learning in Infancy, Elika Bergelson, RePORT, NIH
SEEDLingS project, Rochester Baby Lab
NIH support: NIH Common Fund