Today, kids have the chance to interact with their stuffed animals, robots, or dolls in ways their parents were only able to dream of. These toys, usually referred to as “smart” or “connected,” have built-in motion sensors, speakers, and microphones that allow them to listen to what children say and reply within seconds, searching an online database or the wider internet for an appropriate response. They learn children’s preferences and interests over time, so their play can become personalized, which may improve communication skills, a benefit that has been observed in children with intellectual disabilities. But these toys also open up the internet to children as young as three, creating a new digital frontier that parents and caregivers need to research, understand, and patrol.
While you might have firm rules about what you share on social media, it’s harder to perceive smart toys as a potential threat, says Sophie Linington, deputy CEO of Parent Zone, a social enterprise that helps families safely navigate the internet. “You get lulled into a false sense of security, thinking, ‘Oh, it’s a cute teddy bear.’ But if it connects to the internet then the same kind of thinking needs to be done before you hand one over as with a tablet or a phone.”
Smart toys can be hacked into, and parents should also be aware that any information they collect may not be private. If a smart toy or game communicates with a child — whether by text or by “speaking” to them — those messages or recordings will be transmitted to an external database so they can be analyzed and responded to, and they will likely be stored so the toy can give the impression of having learned information about its owner. How that data is stored, whether it is encrypted, and how secure the passwords that protect it are (if they exist at all) are details companies don’t typically volunteer, and this is such new territory for parents that most may not think to ask.
In the last three years, a series of vulnerabilities has been uncovered. CloudPets, furry toys that allowed children to send and receive audio messages, were pulled from sale after security experts found their online storage system had been left exposed, which led to 820,000 records (including children’s names, ages, and voice recordings) being compromised.
That followed news that talking doll My Friend Cayla, banned in Germany as an illegal spy apparatus, contained an insecure Bluetooth connection, which meant anyone within range could listen in. In 2015, hackers struck Hong Kong-based company VTech, which makes a range of connected toys, including cameras, and accessed the personal data of over 6.3 million customers, including children’s photos and home addresses. Last summer, the FBI issued a public service announcement about the importance of smart toy security.
To be clear, there’s no evidence that information from a smart toy has been used to target any child, either online or in real life. But keeping children safe will be more of a challenge as the market continues to expand; digital forecasters Juniper Research predict it will keep growing rapidly in the years ahead.
Linington recommends reading independent reviews, particularly relating to a toy’s security protections. If you decide to buy one, before giving it to a child, take it out of the box, change any default passwords, and disable those features that aren’t necessary to its use (perhaps a camera or GPS tracker). You should also be prepared to keep track of any recall notices and security upgrades for the life of the toy.
Ashley Boyd, Mozilla’s vice president of advocacy and one of the creators of the company’s buyer’s guide to connected products, says that in several cases the issue wasn’t a confirmed vulnerability but a lack of information, which makes it impossible for customers to make an informed choice. When information is provided, it is too often difficult to interpret. Boyd is also concerned that the most expensive products are usually the ones with the best security. “Is privacy going to become a luxury feature? That would be a really bad outcome,” she says.
A new feature of the guide is a “creep-o-meter,” which lets readers select from a series of increasingly distressed emoji to represent how intrusive they find a specific product. It’s a simple way to send a message to manufacturers, something Boyd thinks we could do more of by, for example, using online customer service systems to ask about security features.
Ultimately, though, she’d like to see companies be more proactive about safety and data protection. Alexandra Ross, Director of Global Privacy and Data Security Counsel at Autodesk, says things are slowly moving in the right direction. The introduction of the General Data Protection Regulation (GDPR) in Europe earlier this year and coverage of high-profile breaches have brought the issue to the fore. “Changes will continue to happen as toy companies realize that to meet customer expectations, they need to build privacy and security into their products.”
In the meantime, Ross doesn’t think customers should be deterred. “There is value to some of these smart toys. There’s educational value, there’s certainly social value and some of them are very entertaining.” The risks involved can often be mitigated, as long as consumers do their research and know what precautions to take. “Unfortunately at this stage parents are taking on risks they’re not aware of,” says Boyd. “That’s the gap we’re trying to close.”