Kids’ Brains Absorb 1.5 MB of Data Just to Master a Native Language
That’s not even a decent pen drive, but it’s bigger than you think.
Here’s a fun fact for the next trivia night your frenemy coworker drags you to: in order to master their native language, kids’ brains absorb 1.5 megabytes of language data between birth and age 18.
Kids may not be worth the pen drive on which you keep all of your photos of them, but storage isn’t quite the point of the finding; the study has significant implications for developing AI that can converse on par with a human.
“Ours is the first study to put a number on the amount you have to learn to acquire language,” said Steven Piantadosi, PhD, an assistant professor of psychology at the University of California, Berkeley. Piantadosi is the senior author of the study, published in the journal Royal Society Open Science, that quantified language development in bits (that is, binary digits) of information.
While 1.5 megabytes may seem tiny next to today’s handheld storage, it reflects an enormous rate of human learning in the first 18 years of life: on the order of 1,000 to 2,000 bits of language information each day, in addition, of course, to the thousands and thousands of bits of information kids are absorbing about everything else. (One byte is eight bits.)
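For a rough sense of how those numbers relate, here is a back-of-the-envelope check; the exact 1.5 MB figure and flat 18-year window are our simplifications, not the study’s precise accounting:

```python
# Back-of-the-envelope check of the headline numbers.
# Assumes exactly 1.5 MB of language data over a flat 18-year window;
# the study itself reports estimates and ranges, not exact figures.
MEGABYTE_IN_BYTES = 1_000_000  # decimal megabyte
BITS_PER_BYTE = 8              # one byte is eight bits

total_bits = 1.5 * MEGABYTE_IN_BYTES * BITS_PER_BYTE  # 12,000,000 bits
days = 18 * 365                                       # roughly 6,570 days

print(f"Total: {total_bits:,.0f} bits")
print(f"Per day: {total_bits / days:,.0f} bits")  # ~1,826 bits/day, within the 1,000-2,000 range
```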
“When you think about a child having to remember millions of zeroes and ones [in language], that says they must have really pretty impressive learning mechanisms,” Piantadosi said.
It also highlights why scientists have yet to develop a conversant AI that sounds natural, he added.
“Machines know what words go together and where they go in sentences, but know very little about the meaning of words,” Piantadosi said.
Kids, on the other hand, are absorbing not only which words go together and where, but also the contextual information about words and phrases that allows speakers to actually use them and understand what others say. For instance, learning and using the word “apple” also requires kids to learn and understand that an apple is for eating (1 bit); that an apple is a fruit (1 bit); that an apple is not a vegetable (1 bit); that an apple has seeds (1 bit); that an apple is red (1 bit); that red is a color (1 bit); and so on. It adds up.
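To see how the counting works, here is a minimal illustrative sketch; treating each independent yes/no fact as one bit is our simplification of the study’s accounting, and the vocabulary size is a hypothetical round number:

```python
# Illustrative only: treat each independent yes/no fact a child learns
# about a word as (at most) one bit of information.
apple_facts = {
    "is for eating": True,
    "is a fruit": True,
    "is a vegetable": False,
    "has seeds": True,
    "is red": True,
    "red is a color": True,
}

bits_for_apple = len(apple_facts)  # one bit per binary fact
print(f"'apple' alone: ~{bits_for_apple} bits")

# Scaled across a vocabulary of tens of thousands of words, it adds up fast.
vocabulary_size = 40_000  # hypothetical adult vocabulary size
print(f"At {bits_for_apple} facts per word: ~{bits_for_apple * vocabulary_size:,} bits")
```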
Researchers have not yet found a good way to adapt this intuitive human learning process to AI, at least when it comes to language. Some, like Fei-Fei Li, co-director of the Stanford Institute for Human-Centered Artificial Intelligence (HAI), have attempted to replicate it. In her earlier capacity as director of the university’s AI Lab, Li built a database of images, each tagged by a human with as many relevant descriptors as possible. For instance, an image of a dog riding a skateboard would have tags like “Dog has fluffy, wavy fur,” “Road is cracked,” etc., reported Will Knight for Quartz in 2016. “But the analogy with human learning goes only so far,” Knight wrote. “Young children do not need to see a skateboarding dog to be able to imagine or verbally describe one.”
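As a purely hypothetical sketch of what one record in such a human-tagged database might look like (the field names and extra tags here are invented for illustration, not drawn from Li’s actual dataset):

```python
# Hypothetical record in a human-tagged image database, loosely modeled
# on the skateboarding-dog example above. Field names are invented.
image_record = {
    "image_id": "img_000123",
    "tags": [
        "Dog has fluffy, wavy fur",
        "Dog is riding a skateboard",
        "Road is cracked",
        "Skateboard has four wheels",
    ],
}

# A model trained on such records learns which descriptions go with
# which images, but only for scenes humans have already described.
for tag in image_record["tags"]:
    print(tag)
```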
While the sheer number of bits kids hold in their heads just to speak a sentence in their native tongue might not do much beyond prompting wonder, the finding may help researchers develop more interactive AI, so both the kids and Alexa can sass back.
Liesl Goecker is The Swaddle's managing editor.