the arguments for keeping artificial intelligence out of therapy

content label: brief discussion of mental health stigma and suicidality

Within counselling and mental health, I’ve seen AI showing up in a number of ways, targeting both people seeking therapy and people providing mental health support. Clinicians sometimes shy away from commenting on AI because most of us don’t identify as “tech-oriented.” And yet it’s impossible to avoid. As I’ve been trying to educate myself about technology in mental healthcare, I really appreciated this recent Guardian essay on AI in healthcare by Eric Reinhart. He writes:

To resist AI optimism is often cast as anti-progress or naive luddism. But progress worth pursuing requires refusing the illusion that faster, cheaper and more standardized is the same as better. True care is not a transaction to be optimized; it is a practice and a relationship to be protected – the fragile work of listening, presence and trust.

As counsellors, part of our job is to think in nuanced ways about relationships—including therapy relationships, our relationships with technology in our own work, and our clients’ relationships with developing AI technologies.

As counsellors, we regularly encounter people who are distressed and who may be using large language model (LLM) chatbots for mental health support. It is important to know whether this practice is likely to be helpful or harmful. This article summarizes a Stanford University study about the dangers of such interactions. The researchers found that chatbots can stigmatize mental health conditions, leaving people feeling more isolated and alone. Just as concerning, LLMs can fail to push back on dangerous suicidal thinking, putting vulnerable people who reach out for support at greater risk than before the interaction. The study gave an example of an AI chatbot expressing sympathy for a person’s distress and then cheerfully supplying information that could aid suicide planning. Cheerfully sharing information isn’t always a helpful response when someone is in crisis; that’s when people often need connection, care, and someone able to push back and help them see beyond the current moment.

But the mental health risks posed by AI aren’t only in AI-as-alternative-to-therapy. Therapists also need to consider the risks of integrating AI into our own practices. AI is showing up everywhere in mental healthcare. I get offers from my electronic records systems to transcribe and summarize clinical sessions for a small fee. I hear stories about ChatGPT-generated treatment plans and assessments, and I’ve encountered newsletters and blog posts that seem likely to be at least partially AI-generated. Most email services now include AI tools that suggest client correspondence, potentially shifting or “polishing” the tone and language we use with our clients. When people are feeling rushed, it makes sense that these services seem compelling. But what do we lose?

The problems with AI in the therapy context are multiple. People have raised clear ethical concerns about the environmental costs, and the amplification of social inequality and bias in generative AI has also been documented. I am also concerned about what is lost clinically in narrowing the focus to spoken language, when much of interpersonal communication, by some estimates as much as 90%, is non-verbal.

One of my biggest concerns about letting robots write our notes is the impact on our clinical thinking. In the research literature, this has been referred to as “AI-induced deskilling.” While notes and treatment plans might seem like trivial “therapist homework,” they can be important spaces for communication and thinking about the work that we do. Like many therapists, I sometimes leave clinical sessions with a lot of thoughts swirling, trying to make sense of the explicit and implicit interactions in the session, measuring my own reactions, wondering about the emotions and histories living underneath the silences in my clients’ stories and statements. I don’t always know what I think until I’ve had some processing time. Most often that happens when I try to write it down, make a few notes, and identify my thoughts, responses, and questions for further exploration. Communicating to myself about the work is a crucial part of the process of therapy, and it helps me come back to client meetings with more to offer towards our collaboration.

By passing off the difficult labor of trying to capture something true and meaningful about our psychotherapy work, we therapists might gain back a few hours in our weeks. But we give away an important space for thinking and feeling about the therapy relationship.
