Many children across the globe received one or more smart toys during the holiday season. These toys are popping up everywhere and are on many children’s must-have lists. Parents, grandparents, uncles, and aunts will have given in to the pressure and added a smart toy to a child’s much-loved collection. At the same time, an influx of potential vulnerability points is emerging that hackers are getting ready to exploit. They want to steal your data! This leaves us with some happy children and happy hackers, but not so happy parents.
You do not need to look too deep to find a company or toy that has been responsible for a data breach of children’s personal information. Some notable examples: VTech, which was fined substantially for its breach of children’s information; CloudPets, marketed as “a message you can hug”; and the My Friend Cayla doll. The list is only expanding, especially as smart toys grow in popularity.
This is not a surprise, but what is quite surprising is how readily parents or guardians give these toys to their children without much thought, especially since most parents are overly protective in ensuring that their children are safe in all other regards. Why is this any different? Many parents do not realize the impact a smart toy can have on their and their child’s privacy, or how dangerous these toys can actually be. The teddy that responds, the robot that interacts, the duck that reads your child’s emotions — do parents stop to think about how these toys accomplish such things, and at what cost?
Looking at some of the big recent breaches highlights the need for education around smart toys. Some of them are vulnerable to hacks, but some do not even need to be hacked to cause a problem — all that is required is an improperly configured database. And by all means, if you want your kids to enjoy these toys, just make sure that you and your child are educated about what they can do.
To do what they do and respond as they do, these toys typically need data, and they need to connect to the Internet. As your child is playing with them, the toys are using that data (and other data that they “see” in their vicinity), recording it, processing it, and storing it in the cloud. Do you know what this data is or what it is being used for?
What’s behind the soft, cuddly exterior?
In 2017, hundreds of smart toys were released. These toys have their place, but in some cases users are trading their privacy for access or a subscription to the toys’ ecosystems. Moreover, these toys are not always built safely. Many of them are built by hardware manufacturers that produce hardware to specified standards so that it is simple to use with software APIs.
These hardware manufacturers produce the modules en masse, and different developers put the APIs and software together, differentiating their products by adding the “smarts.” This intersection of hardware and software is fantastic and exciting. However, it has its weaknesses, too.
Some of these toys have been hacked and have known vulnerabilities that have been exploited and used against their users. We hear of users whose information was obtained from a livestream, with the attacker then trying to use the video to blackmail them. This is a breach of privacy, just like someone using a CCTV camera to spy on a person without permission. All of this means that some of these devices are not secure by default.
Many of these toys carry on conversations, which requires a microphone. That microphone could be listening to other conversations, not only your child’s, and the audio is often transmitted to a server in the cloud.
Several attacks are possible. An attacker with physical access can easily compromise these toys, obviously. But there is also an attack vector while data is in transit — for example, with toys that stream video, voice, and other live information. The final potential place of attack is the data that gets stored and processed in the cloud, where it can be modeled, processed, and mined for value by the company — or by hackers.
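One way a curious parent can probe the in-transit vector is to check whether a toy’s cloud endpoint even speaks TLS. The sketch below, using only the Python standard library, attempts a TLS handshake and validates the server certificate against the system trust store; the hostname shown is purely a hypothetical placeholder, since each toy talks to its own vendor’s servers.

```python
import socket
import ssl

def uses_tls(host, port=443, timeout=3.0):
    """Attempt a TLS handshake with host:port.

    Returns True if the server completes the handshake with a
    certificate that validates against the system trust store,
    False otherwise (plaintext service, bad certificate, or no answer).
    """
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version() is not None
    except (ssl.SSLError, OSError):
        return False

# Hypothetical endpoint -- find the real one in your router's logs
# or a packet capture while the toy is in use:
# print(uses_tls("cloud.example-toy-vendor.com"))
```

A False result does not prove the toy is unsafe (some use other protocols), but a toy that streams audio or video over a plaintext channel is exactly the in-transit weakness described above.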
These technologies that “help” our children could be used in a way that counters that. Smart watches for kids, meant to track the child and give parents peace of mind, could be accessed by the bad guys to do the same tracking — and even to get in contact with the child.
GDPR, smart toys, and your children’s privacy
Many of these devices are not as regulated as they should be. Specifically, the services that they offer have been largely unregulated from a privacy perspective until recently.
With the advent of GDPR, developers of these devices and services had better sit up and listen. Users’ privacy is very important, and if the rights and freedoms of their users (especially in Europe) are not respected, they can expect fines and enforcement action by the supervisory authority — particularly if the data collected through the smart toys results in special category information, like information about children. In short, tracking the user of these toys, mishandling the data collected while the child is using the toy, and retaining that data are not allowed without explicit consent from the child’s parents or guardians.
Some of these toys are set up to gather data on the user by default and without user permission, and the child or the parents usually do not understand what data is being collected or how it is going to be used. The manufacturers and developers forget that this information does not belong to them, nor do they have any right to collect or store it without the user giving appropriate permission to the company. Ignoring the rights of users in the EU is against the GDPR, and after May 25, when the regulation takes effect, there is a clear way to report such transgressions.
A default setting of collecting information is deemed not to be secure, as it breaks one of the three pillars of security: confidentiality. The user’s data is obtained and their confidentiality is not maintained, since unauthorized individuals and companies may have access to it. That users are unaware is a serious issue; in some cases, smart toys with built-in cameras have been found to record users, store the video in the cloud, and stream it. This is cause for serious privacy concern.
There is nothing wrong with using responsible and secure technology in a smart toy. But parents should research and understand what they are bringing into their home and what it means for their children.
If you use any of this technology, just take the time to think creatively about how the data collected could be used and whether you are comfortable with others having access to this type of information. In many cases, people say they don’t mind — until there is a breach and the data is used against them in some way they never imagined possible.
You need to understand the security and privacy risks just as you understand the risk of letting your child cross a road on their own. You teach them how to do this safely and have precautions in place — for example, “look left, look right, and look left again.” The same precautions need to be in place when giving your child a smart toy to use and interact with.
Things to consider
- Only connect and use smart toys in environments with trusted and secured WiFi Internet access.
- Closely monitor children’s activity with the toys (such as conversations and voice recordings) through the toy’s partner parent application, if such features are available.
- Check what personal data the toy is collecting and for what purpose (the more sensitive the data the higher the risk of breach and negative impact on you and your child).
- Carefully read disclosures and privacy policies and check how the personal data is processed and if it is being sent to third parties for processing.
- Check the default settings and see if you are able to change the privacy settings.
- Check whether the toy is always on (connected) or whether there is an offline option. Otherwise, the toy may be monitoring more than you are aware of, all the time (listening to you and watching you as you go about your business at home).
- Take care when clicking yes to everything you sign up for and avoid companies that give you no choice on your own data. It’s your data, you should control it.
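As a practical starting point for the checklist above, you can see which network services a toy exposes on your home network. This is a minimal sketch using the Python standard library; the toy’s IP address is a hypothetical example — look it up in your router’s list of connected devices.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Hypothetical address -- replace with the toy's IP from your router:
# toy_ip = "192.168.1.42"
# print(scan_ports(toy_ip, [23, 80, 443, 554, 8080]))
```

Unexpected open ports (for example, telnet on 23 or an unauthenticated video stream on 554) are a signal to dig into the toy’s settings and documentation — and only scan devices on your own network.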
It may be difficult for some parents to believe that simply playing with toys could have such consequences, but the risk is there. Data breaches are happening, and children’s (and parents’) sensitive data is being leaked and used inappropriately. Photos, video, and voice data are all vulnerable. These toys are designed to monitor locations and sometimes the health of children, to analyze personal qualities, and to interact with children through conversation. They are meant to have a positive impact on the well-being of our kids, but there is a scary side that we need to do our best to avert so that we can protect our privacy and that of our children, too.
Photo credit: CloudPets