My Therapist, ChatGPT
The risks of using AI for mental health treatment
How many times have you heard a friend say, “Oh, I asked Chat and got a great answer!”? I, for sure, have heard many, and even some clients tell me that they’ve used AI between sessions or asked it mental health questions. ChatGPT is an artificial intelligence (AI) chatbot designed to mimic human language. ChatGPT, or Chat as many call it, can answer questions, write code, create pictures, and generate content. Chat works by analyzing data to predictively generate text, producing something that feels real and genuine.
It even has a voice that it uses, making it seem like your friend (or therapist). It’s also accessible and immediate. You type in a question or a phrase, and you have the immediate gratification of getting a response. You don’t have to wait for a text or call back. You don’t have to be vulnerable with a stranger. All of this can lead you to think that it has the ability to solve problems, when really all it can do is give you an answer based on the information it has.
With the warm voice and the ability to feel like it can solve your problems, it’s no wonder that more people are turning to Chat and other AI models for mental health treatment. However, this comes with multiple risks and concerns.
Chat doesn’t actually know you
Chat works by taking your words, scanning its incredibly large database, and giving you an answer based on what you are saying. In other words, even though it can sound empathetic and offer insight, it’s ultimately a machine. Chat does not pick up on shifts in your tone, your body language, or what you aren’t saying.
As a therapist, I’m not just listening to what you say. I’m seeing if your words match up with your body language. I’m looking for any hesitation you may have, or if something makes you uncomfortable. I look to see how many times you bring something up, or if there is anything you avoid talking about. And most importantly, I get to know the true you. I want to know your likes, your interests, your hobbies, the important people in your life. I’m going to ask all of the questions, even ones that may make you uncomfortable. The biggest predictor of success in therapy is the therapeutic relationship, and I work hard to build that with each and every client.
It validates without challenging
Chat is fantastic at validating you. It will tell you exactly what it thinks you want to hear. Go on, I encourage you to try it. See what it says. Just for this article, I tried telling Chat I had an argument with my partner. Chat immediately validated my feelings, telling me that it “makes sense that it’s sticking with you. You don’t have to solve it right now. Tonight might just be about getting through.”
Okay! Wow, Chat, this sounds great. But looking at it a little deeper, other than validating me, what has it done? It hasn’t given me any conflict resolution skills, and it hasn’t done a deep dive into how or why the argument happened. It didn’t even question whether I could have said or done something differently. Instead, it told me how right my feelings were without offering any next step.
As a therapist, I’m not here just to validate you, or tell you how right you are. If that’s all I did, it wouldn’t encourage growth or change. My job is to help you see your struggles from a new perspective. Chat just can’t do that.
Chat lacks clinical judgment
Getting my degree took two years and 60 credits covering multiple aspects of mental health, along with two internships totaling hundreds of hours in two different settings. Once I graduated, I had to complete 3,000 supervised clinical hours before I could practice independently. At a minimum, this took five years, and it was another year before I started practicing as a therapist. I’ve worked with hundreds of people, face to face. Chat does not have this experience or knowledge to deal with the complexities that lie within mental health.
Something I’ve noticed is that more and more clients will ask Chat, “Do I have depression, anxiety, ADHD, autism, etc.?”, list some symptoms they are experiencing, and, since Chat is so validating, it tells them yes or says it’s highly likely. Chat is not actually evaluating what your symptoms are or how they compare to others. Chat cannot tell whether your symptoms are clinically relevant, while a therapist can.
Chat is not trained to assess risk or to respond appropriately to crisis situations. It may not recognize when something requires more support. Meanwhile, with all of the training I’ve had, I can tell when something is low, medium, or high risk (I do risk assessments every session). I look for signs of increasing depression or anxiety, and for any risk factors for self-harm. I’m able to develop safety plans with you and help with crisis intervention. AI may give you a surface-level response, but it doesn’t actually have the knowledge or understanding of what is happening.
Final Thoughts
If you’ve been using AI as a therapist, this post isn’t to judge or shame you. As I said in the beginning, Chat is immediate, accessible, validating, and there for you. The dangers come when we as a society rely on it for mental health treatment or think it’s the same as a therapist.
You deserve more than AI. You deserve to be known and understood. You deserve someone who can support you, wherever you are.
If you are feeling overwhelmed, stuck in your thoughts, or trying to hold it all together, therapy can give you a space that goes deeper than quick answers and instant gratification. Therapy can bring you the peace you deserve.