Do toys that ‘listen’ steal children’s privacy?

Parents have privacy concerns about “smart” toys, such as Hello Barbie and CogniToys Dino, which record the voices of children who interact with them and store those recordings in the cloud, researchers report.

These toys, which connect to the internet, can joke around with children and respond in surprising detail to questions posed by their young users. The research also reveals that kids are usually unaware that the toys are actually recording their conversations.

“These toys that can record and transmit are coming into a place that’s historically legally very well-protected―the home,” says co-lead author Emily McReynolds, associate director of the Tech Policy Lab at the University of Washington. “People have different perspectives about their own privacy, but it’s crystalized when you give a toy to a child.”

Though internet-connected toys have taken off commercially, their growth has been accompanied by security breaches and public scrutiny. VTech, a company that produces tablets for children, was storing personal data of more than 200,000 children when its database was hacked in 2015. Earlier this year, Germany banned the Cayla doll over fears that personal data could be stolen.

It’s within this landscape that researchers set out to understand the privacy concerns and expectations kids and parents have for these types of toys.

Telling secrets

The researchers interviewed nine parent-child pairs, asking questions such as whether the child liked the toy and would tell it a secret, and whether the parent would buy the toy or share what their child said to it on social media.

They also watched the children, all 6 to 10 years old, playing with Hello Barbie and CogniToys Dino. The toys were chosen because they are among the industry leaders for their stated privacy measures. Hello Barbie, for example, has an extensive permissions process for parents when setting up the toy, and it has received praise for its strong encryption practices.

Most of the children participating in the study didn’t know the toys were recording their conversations. The toys’ lifelike exteriors also probably fueled the perception that they are trustworthy; children might be less inclined to share secrets and personal information with similar tools that aren’t designed as toys, such as Siri and Alexa.

“The toys are a social agent where you might feel compelled to disclose things that you wouldn’t otherwise to a computer or cell phone,” says co-lead author Maya Cakmak, an assistant professor in UW’s Paul G. Allen School of Computer Science & Engineering. “A toy has that social exterior which might fool you into being less secure on what you tell it. We have this concern for adults, and with children, they’re even more vulnerable.”

Some kids were troubled by the idea of their conversations being recorded. When one parent explained how the child’s conversation with the doll could end up being shared widely on the computer, the child responded: “That’s pretty scary.”

‘I’ll remember everything…’

At a minimum, toy designers should create a way for the devices to notify children when they are recording, the researchers say. Designers could consider recording notifications that are more humanlike, such as having Hello Barbie say, “I’ll remember everything you say to me,” rather than relying on a red recording light that might not make sense to a child in that context.

Most parents expressed concern about their child’s privacy when playing with the toys. They universally wanted parental controls, such as the ability to disconnect Barbie from the internet or control the types of questions to which the toys will respond. Further, the study recommends that toy designers delete recordings after a week, or give parents the ability to delete conversations permanently.

A recent study suggested that video recordings filtered to preserve privacy can still allow a tele-operated robot to perform useful tasks, such as organizing objects on a table. That study also found that people are much less concerned about privacy when such filters are in place, even for sensitive items that could reveal financial or medical information.

Speech recordings on connected toys could be similarly filtered to remove identity information and encode the content of speech in less human-interpretable formats to preserve privacy, while still allowing the toy to respond intelligibly.
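
As a rough sketch of that idea (not something described in the study), a toy’s companion software could transcribe audio on the device, scrub likely identifying details from the transcript, and upload only the redacted text along with an explicit delete-by date. The redaction patterns and the one-week retention window below are illustrative assumptions, not details from the paper.

```python
import re
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=7)  # assumed policy: recordings expire after a week

# Crude stand-ins for a real PII / named-entity detector (illustrative only).
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),                              # phone-like numbers
    re.compile(r"\bmy name is\s+\w+", re.IGNORECASE),                              # self-introductions
    re.compile(r"\b\d+\s+\w+\s+(street|st|avenue|ave|road|rd)\b", re.IGNORECASE),  # street addresses
]

def redact(transcript: str) -> str:
    """Replace likely identifying spans before anything leaves the device."""
    for pattern in PII_PATTERNS:
        transcript = pattern.sub("[REDACTED]", transcript)
    return transcript

def prepare_upload(transcript: str) -> dict:
    """Package only the redacted text (never raw audio) with a delete-by time."""
    now = datetime.now(timezone.utc)
    return {
        "text": redact(transcript),
        "recorded_at": now.isoformat(),
        "delete_after": (now + RETENTION).isoformat(),
    }

if __name__ == "__main__":
    # On-device speech-to-text is assumed to have produced this transcript already.
    print(prepare_upload("My name is Ada and I live at 12 Oak Street."))
```

The point of the sketch is the design property, not the specific patterns: raw audio never leaves the device, and whatever the cloud service does receive carries a built-in expiration.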

“It’s inevitable that kids’ toys, as with everything else in society, will have computers in them, so it’s important to design them with security measures in mind,” says co-lead author Franziska Roesner, assistant professor at the Allen School. “I hope the security research community continues to study these specific user groups, like children, that we don’t necessarily study in-depth.”

The Consumer Privacy Rights Fund at the Rose Foundation for Communities and the Environment and UW’s Tech Policy Lab funded the work. Researchers presented their paper at the CHI 2017 Conference on Human Factors in Computing Systems.

Source: University of Washington