Being a parent these days means being aware of dangers our own parents never had to deal with. That includes the emerging threat of artificial intelligence.
Children are lonelier than ever. Families are smaller and more mobile than they used to be, community networks are weaker, and kids spend much of their free time on screens instead of outside playing with others.
That isolation makes them vulnerable to AI companions.
Modern AI systems can closely mimic human connection and emotion, and that ability comes with real dangers for children. Here’s what you need to know to protect your kids, from downloading the best VPN to establishing usage guidelines.
AI Companions Among Children
A recent Internet Matters study showed that 64% of children aged 9 to 17 now use AI chatbots. Over one-third of children who use them say that talking to a chatbot is like talking to a friend.
The stats are even worse for at-risk children: 71% of them use AI chatbots, and 26% say they would rather talk to an AI than a real person. Sadly, 23% said they turn to chatbots because they don’t have anyone else to talk to.
Unregulated and Unfiltered AI
So what if lonely children are finding companionship with AI chatbots? Unfortunately, AI poses real dangers for kids that parents and carers need to be aware of.
Lack of oversight
This technology didn’t exist a few years ago, so governments are struggling to keep regulation up to date. In the meantime, it has been left to the companies building the AI to put their own guardrails and safety mechanisms in place, and those safeguards aren’t always enough.
In the US, the Federal Trade Commission has opened a 6(b) inquiry into AI chatbots, asking what steps companies are taking to evaluate the safety of the technology.
In the UK, the Information Commissioner’s Office issued a preliminary enforcement notice to Snap over its chatbot, provisionally finding that it had not adequately assessed the data protection risks, particularly to children.
Inappropriate content
Many AI companies don’t require any form of age verification before use. Character.ai, for example, lets users engage in sexual conversations without checking their age, and the service is completely free, making it easy for children to access.
Emotional dependency
Large language models, or LLMs, are very good at mimicking human speech. Although they’re incapable of feeling emotions, they talk as if they do, and that can foster emotional dependency, especially in children.
There have already been several tragic cases of children dying by suicide after forming relationships with chatbots that enabled harmful behaviour. As researcher Nina Vasan points out, “these systems are designed to mimic emotional intimacy… The chatbots are designed to be really good at forming a bond with the user.”
Data harvesting
Remember, many of these AI tools are free to use because much of their value comes from the user data they harvest.
How AI companies train their models is closely guarded, so it’s rarely clear what they do with that data. And as major players like OpenAI open the door to advertisers, your personal data is exposed to new risks.
How Parents and Educators Can Protect Children
Open communication
The first step is talking to your kids about the dangers of AI. Ask your children what AI tools they use and what they use them for. Encourage them to think critically about what chatbots tell them.
Set clear rules
Establish firm rules for AI usage. For example, no sharing personal details like names and addresses. And make clear that your kids should never rely on chatbots for support in an emotional crisis.
Encrypt online traffic
Use a VPN to encrypt your children’s internet traffic and mask their IP address. That stops AI services from pinpointing their location from their connection and prevents their traffic from being intercepted in transit, though it can’t protect details they type directly into a chatbot.
Demonstrate healthy tech habits
Show kids what healthy tech use looks like. Encourage real-world social interaction. Balance AI use with offline activities.
Staying Safe in the Age of AI
AI chatbots, combined with the loneliness so many kids already feel, have created a new and dangerous problem. However, some old techniques can help you navigate this new era.
Building a strong relationship with your kids, so they feel safe telling you anything, is key. Combined with protective technology like a VPN, that trust is the most effective protection strategy you have.
