
2 Ways Your Child Might Engage Socially With AI

If your child is like many children today, screens factor into their free time more than you’d like. And if you’ve dabbled with ChatGPT, Gemini, or any other generative AI chatbot, you know how human-like their text and voice interactions sound. Because of this, chatbots can be an alluring and fun way for kids to pass the time. To them, it can feel like texting a friend, and that may be exactly how kids end up using AI.

But chatbots are machines. With the prevalence of generative AI apps, you should be aware of how your child might engage socially with AI now or in the near future, and of the ramifications of those interactions. For children and teens, these interactions may not be healthy. Here’s what you need to know about the social side of AI if your child starts spending time with chatbots to fulfill social needs.

1. Kids might use AI as a companion or friend.

With AI, it’s now possible to build something close to the perfect friend: one who’ll talk to you whenever you want and never leave your messages on “read” for hours. But would you want a friend who’s always ready to talk? Would you allow your child to build one?

Several sites, including Replika, Nomi, and Kindroid, let users create AI friends for free. Users customize their features and even name them. As a user chats with an AI friend, the AI learns about the user and tailors its responses. The result feels very similar to chatting with a real human being. Among younger users, Character.AI is a popular choice with a large teenage base. “[C]ompanionship apps are one of the fastest-growing parts of the A.I. industry,” says tech journalist Kevin Roose. “Some users will scoff at befriending a chatbot. But others, especially people for whom socializing is hard or unappealing, will invite A.I.s into the innermost parts of their lives.” For some kids, talking with an AI could be very appealing, even helpful. But is it healthy?

Is an AI friend a good option?

Your child might create an AI friend, Roose says, which “won’t be a gimmick, a game or a sign of mental illness. It will feel to them like a real, important relationship, one that offers a convincing replica of empathy and understanding and that, in some cases, feels just as good as the real thing.” But parents should know that even if an AI friend can provide support for their child, it’s still a machine with no emotions or feelings, simply a bot trained on vast quantities of data from the internet. It won’t have the same emotional responses as a human friend, and it’ll likely tell your child only what he or she wants to hear. Your kids need to know this too.

And because it’s a booming field and our children are frequently on screens, they may stumble across an AI friend site at some point. Creating an AI friend, however, shouldn’t be a substitute for real social interactions. MIT professor Sherry Turkle says talking to a machine doesn’t “develop the muscles—the emotional muscles—needed to have real dialogue with real people.” Chatting with an AI friend might help some kids develop stronger social skills, as one teenage user points out in The Verge, but apps like Character.AI can become addictive and blur the line between fantasy and reality.

AI romantic friends are also appealing to some kids and teens.

Parents should also be aware that companion apps like Character.AI let users create romantic companions, and dedicated AI girlfriend and boyfriend apps exist as well. With both, content can quickly become X-rated.

Rather than letting your kids turn to machines as their primary source of support, let them know you’re available whenever they want to talk. Check in often on how they’re doing, and spend quality time together to build a strong foundation of trust.

2. Kids may use AI as a therapist.

While a search engine calls up a list of sources for a user to sift through, generative AI changes the game. Chatbots interact with users in a back-and-forth manner, much like talking to a human, and use those sources to inform their responses. Because of this, it’s no surprise that people have turned to chatbots for mental health support. ChatGPT will provide human-sounding responses to almost any inquiry, but other chatbots are trained specifically to respond like a therapist. Researchers have found that therapeutic chatbots in development are trained to “boost well-being, using CBT, mindfulness, and behavioral reinforcement activities,” and that “there can be positive mental health outcomes” from using them.

But is an AI therapist right for your child?

More kids and teens are grappling with mental health issues like depression, anxiety, and loneliness today than a few years ago. Because of that, there’s often a long wait to see a licensed therapist. But with generative AI, mental health chatbots offer a form of therapy without a wait, and they’re available 24/7. For some, this could be the support they need while waiting to see a professional in person. And for those wishing to avoid the stigma of seeing a real therapist at all, a bot can fill that need.

A popular AI bot on Character.AI called Psychologist has been messaged millions of times since it was created less than two years ago. It’s “by far the most popular mental health [bot], with many users sharing glowing reviews on [the] social media site Reddit,” according to the BBC. Tech writer Jessica Lucas interviewed a handful of teenage users who found the Psychologist bot “helpful, entertaining, even supportive.” Having someone to talk to who listens without judgment is one of the main reasons kids find chatbots useful.

Dr. Kelly J. Merrill Jr., who studies the mental and social health benefits of communication technologies, says, “The research shows that chatbots can aid in lessening feelings of depression, anxiety, and even stress. But it’s important to note that many of these chatbots have not been around for long periods of time, and they are limited in what they can do. Right now, they still get a lot of things wrong. [People who] don’t have the AI literacy to understand the limitations of these systems will ultimately pay the price.”

Apps can be an easy solution. But are they the right choice?

Because many apps that can serve as a therapist or friend are now available for free at a basic level, a child who’s lonely or has had friendship issues might find them very appealing. But my guess is you’d rather be the one your child turns to when he or she is hurting or struggling. The younger and less mature the user, the more likely he or she is to believe there’s a real, caring person on the other end of these apps. It’s territory you may not want your child to delve into, especially if he or she has more serious mental health concerns.

For some children facing challenges, these apps could help—especially if you’re in the process of seeking a human therapist but haven’t yet found the right one. But be skeptical and vigilant, and always try to be your child’s first option when he or she needs to share feelings. Ask your kids about their day, stay current on your kids’ real-life friends, and spend time with your kids every day. Knowing you’re there to listen without judgment may make your child’s decision to turn to you for support, rather than a machine, a lot easier.

We don’t always know how kids will use AI, but we need to be prepared. What AI-related discussions have you had with your kids so far?

ASK YOUR CHILD...

Why is it important to have face-to-face conversations with people instead of just chatting online?
