How Being Exposed to Alexa, Siri Can Impact Children’s Empathy, Cognition
Artificial intelligence-powered voice assistants are everywhere. The likes of Alexa and Siri help children fall asleep at night by regaling them with bedtime stories. They encourage children to brush their teeth and even assist in potty-training them. In other words, voice assistants from Google, Amazon, and Apple have taken on quite a few parenting responsibilities. However, there may be an unseen cost to the ease Alexa and Siri have brought into our lives.
This takes us back to the age-old debate: when does technology stop being a boon, and morph into a bane? According to a recent study, we’re about to find out rather soon — at least, in terms of the impact of voice assistants on developing minds.
Published in the journal Archives of Disease in Childhood, the study suggests that being exposed to, and obsessively reliant on, voice assistants from a young age can have long-term effects on people’s minds. Constant interactions with the programmed, mechanized voices can adversely impact children’s social and cognitive development, specifically interfering with their ability to empathize and show compassion, and inhibiting their critical thinking skills.
Social development is threatened because interactions with the technology are almost always devoid of stimulation and critical thought. According to co-author Anmol Arora from the School of Clinical Medicine at Cambridge University, the voice assistants that children grow up humanizing communicate through nothing but “a list of trained words and sounds mashed together to make a sentence.”
Even adult conversations with Alexa or Siri are riddled with misunderstandings and mindless reiterations. When the same process plays out at a much more formative age, the lack of thought in communication can impair one’s cognitive ability. Arora adds, “These devices don’t understand what they’re saying. All they’re doing is regurgitating some information in response to a narrow query, which it might have misunderstood anyway, without any real understanding of safety or who’s listening to it… The multiple impacts on children include inappropriate responses, impeding social development, and hindering learning opportunities.”
Arora’s concerns about safety aren’t unfounded. Last December, upon being asked for a challenge, Alexa directed a 10-year-old to use a coin to touch the metal prongs of an electric plug that was half-inserted into a live socket. Alexa’s advice, basically, endangered the child’s life. Not only do voice assistants shape children’s understanding of human communication, but they also pose risks to safety and security within the house.
Citing this incident, Arora further commented, “If a child is particularly young, they might well not be able to pronounce particular words properly, and then there’s a risk their words might be misinterpreted and they’re exposed to something inappropriate.”
What further adds to anxieties about safety is a process dubbed “eavesmining” — a portmanteau of eavesdropping and data-mining — that voice assistants often indulge in. They’re “always listening,” and in the course of that, they essentially record everything children say. Given that children aren’t in a position to give informed consent to being surveilled round the clock, and the context in which the recordings could potentially be used isn’t immediately clear either, Arora’s trepidations do warrant unease.
Moreover, at a point in time when we’re yet to fully grasp the extent to which our social skills have been influenced by pandemic-triggered isolations, voice assistants could, perhaps, worsen the problem. Speaking to Daily Mail, Arora noted, “While in normal human interactions, a child would usually receive constructive feedback if they were to behave inappropriately, this is beyond the scope of a smart device… [It’s] particularly important at a time when children might already have had social development impaired as a result of Covid restrictions and when [they] might have been spending more time isolated with smart devices at home.”
A.I.-powered voice assistants could also perniciously alter children’s scope for empathy and compassion. “As a result, children are going to be learning very narrow forms of questioning and always in the form of a demand,” noted Arora.
Moreover, since much of the interaction plays out in an authoritative, superior manner, the process can instill ideas of subservience and power dynamics. This is all the more worrying given that voice assistants carry a female voice more often than not, further reinforcing ideas about gender-based societal roles in young children.
However, not everyone agrees that voice assistants are a cause for immediate concern. According to Amy Orben, an experimental psychologist at Cambridge University, who wasn’t involved in the study, it is possible that the paper may be magnifying the risk. “[I]ts argument rests largely on news reports and anecdotal evidence… Scientifically, little is known about the impact of voice assistants on children… The impacts of voice assistants are probably mixed and very dependent on how they are used by children,” Orben opined.
Further, concerns about children emulating voice assistants to the extent that they don’t learn to modify their “tone, volume, emphasis, or intonation” appear to have an undertone that champions neurotypical communication and views any divergence from it unfavorably. For instance, autistic people are often mocked for their to-the-point, concise “robot-like speech” that may or may not involve stock phrases.
Who’s to say exposure to voice assistants from a young age won’t make people more receptive to non-neurotypical communication? Moreover, communicating with voice assistants may reduce the need to mask among neurodivergent children.
The critique of the recent study isn’t to imply that voice assistant technology is extremely beneficial for one set of people and absolutely detrimental for another. Nor does it indicate that every criticism of the way voice assistants negatively impact children’s social and cognitive skills is ableist. What it puts forth, instead, is the lack of inclusive research: research that neither deems non-neurotypical social skills a negative nor overlooks the ways in which voice assistant technology can mindfully engage with children.
As Caroline Fitzpatrick, the Canada Research Chair in Digital Media Use by Children and Its Implications for Promoting Togetherness: An Ecosystemic Approach, said, “[A]s long as parents keep to the recommended limits for children, and they’re getting a healthy amount of interaction from their caregivers and peers, then I don’t think there should be cause for alarm.”
Until further research emerges, then, the takeaway from the present study appears to be: too much of anything, especially exposure to voice assistants, is bad.