Can AI Replace Your Therapist? Why Teens Are Turning to ChatGPT for Mental Health Support
Photo by MART PRODUCTION on Pexels

You just had an argument with a friend. You're fairly sure you're in the right and they're in the wrong, and yet, as you think about it more and more, something else starts to gnaw away at you. This, after all, isn't the only argument you've had recently; you had a falling out with another friend months prior, and it seems like all your relationships have been crumbling. You feel defeated, tired, desolate. But who can you talk to about this that will understand your perspective as it is, and you can remain largely anonymous? The answer, apparently, is ChatGPT.

Alarmingly, a recent study by Common Sense Media found that an estimated 5.2 million adolescents use AI chatbots for mental health advice—that's one in eight teens and young adults. But why are youth turning to AI for companionship and support? And can ChatGPT really replace a human therapist?

AI Is Accessible

One of the main reasons researchers believe young adults are turning to AI chatbots like ChatGPT for support is simply that they're accessible. They're mostly free to use, or at least more affordable than traditional therapy and counselling. Replies are also instantaneous.

You can message an AI companion at any time of day, for any reason. You can share in-depth perspectives, snippets of conversations, detailed context. You can talk to it however you want and have it respond as if it were a real friend. For many teens, this is enough to build a sense of trust and companionship. It's no wonder that a growing share of young people rely on AI to help them through personal conflicts.

It's a Judgment-Free Zone

Photo by Matheus Bertelli on Pexels

Another reason teens and young adults entrust AI chatbots with their personal issues, rather than talking them out with real friends and family, is privacy. Even though most AI companions require logging into an account, many adolescents believe they can remain fairly anonymous, and that belief makes them braver.

As a result, they're more likely to share intimate stories, experiences, and information. Talking with a real person or therapist, by contrast, no matter how confidential, means confiding in someone who can match a name to a face. That person will then be privy to all their struggles, relationships, and internal—and external—conflicts. Something about that can make them feel vulnerable and exposed, and chatting with AI lets them avoid that discomfort. So they take what they consider the "safer" route.

Can AI Truly Replace Human Therapists?

Of course, AI chatbots like ChatGPT can't replace human therapists. Therapists are licensed mental health professionals who devise and deliver care that AI can't replicate, offering guidance tailored to each person. They also know how to monitor, diagnose, and administer treatment plans that safeguard an individual's safety and overall well-being. AI, by contrast, doesn't reliably recognize when to check in, intervene, or suggest further support. Nor can it offer real empathy or understand emotional nuance.

The scary thing about AI is that even implemented guardrails can easily be sidestepped, allowing teens and young adults to ask about harmful subjects. There's also the problem of privacy. While it might feel more comfortable to share intimate thoughts and experiences with a chatbot than with a real person, the platforms' licensing terms may not keep that information secure and confidential. As Common Sense Media noted in its study, "The nature of these licenses means that personal information shared by teens—including intimate thoughts, struggles, or personally identifiable information—can be retained, modified, and commercialized indefinitely, even if teens later delete their accounts or change their minds about sharing."

All of this goes to show that, no matter how much younger users trust AI companions, companies need to do their part to ensure that generative AI platforms don't end up doing more harm than good.