ChatGPT for Therapy? What A Human Therapist Wants You to Consider
AI is everywhere, and the therapy world is no different. Increasingly, people are turning to AI software, like ChatGPT, for “therapy”. I put therapy in quotation marks because simply responding to questions with lists of coping strategies or potential solutions isn’t really my idea of effective therapy. However, that doesn’t mean AI can’t be helpful! What it does mean is that we need to be wise about how and why we are using AI in this way. Here are some things I’d recommend you consider if you’re using AI for therapeutic benefit.
1. What am I looking for?
AI is great for all kinds of things, and it can help people develop tools and strategies for addressing certain issues. In this way, I see AI essentially as an interactive self-help book. If that’s all you need, great! But if you want someone to hold space for you, listen and validate your experience, and offer new ideas that may not already be on the internet, this is where AI will fall short.
2. AI isn’t always accurate.
AI is pretty smart, but it’s not an expert on psychology or therapy. Some people have had experiences with AI offering them less-than-helpful guidance. For example, an organization that assists people with eating disorders recently had to discontinue its AI program after it told clients to start counting calories (that’s a no-no in eating disorder treatment!). AI may not always give you the best information for your health, and it certainly doesn’t know how to set boundaries or redirect you if your line of questioning isn’t healthy.
3. Therapy is more than coping strategies and finding solutions.
What really makes therapy effective isn’t the tools or skills that your therapist recommends. It’s not the worksheets or specific interventions. What makes therapy effective is your RELATIONSHIP with your therapist. Being seen and understood by another person is a deeply healing experience, one that requires a therapist to be present, pay attention, and hold space for your thoughts and feelings. This is something that AI just can’t do. No matter how fancy it may seem, you will still know that you’re talking with a computer and not a person.
4. Confidentiality and Crisis Assistance
What is happening with your ChatGPT chats? Do you know? That data is being stored somewhere, but because ChatGPT and other AI programs aren’t health-care providers, they aren’t beholden to the same privacy laws as your human therapist. I can’t tell others what you say in session, but with AI, you don’t have those protections. This puts a lot of people, especially vulnerable populations, at risk. What happens when ChatGPT gets hacked and your user data gets exposed? With human providers, you have recourse. With ChatGPT, you don’t.
Additionally, how does ChatGPT ensure that people are safe? If you tell an AI program that you’re suicidal, is it going to follow up with you the next day? Is it going to make sure you have a safety plan? If a teenager tells ChatGPT that they are being abused, will the AI contact child services, the child’s parents, or the police to help protect that child? The answer to these questions is no. But your human therapist? They are mandated by the government to keep you safe, and if they don’t, they are held accountable for it. Who is ChatGPT accountable to?
I’m all for people having open access to tools and resources to support their mental health. AND I am a firm believer that when it comes to therapy, there is nothing as good as the real thing. If you’re using AI as a therapy tool, be wise and use it with good judgment! And if you’re looking for a human therapist, I am one! Click here to schedule your free consultation call.