AI in Therapy: The Unseen Impact on Client Privacy and Trust (2026)

Imagine discovering your therapist has been using AI to analyze your sessions without your knowledge. It's not just a breach of trust; it's a violation of privacy. That is exactly what happened to Molly Quinn, a 31-year-old librarian who had been in therapy for two years without incident, until her therapist introduced a new AI-powered note-taking app. The therapist claimed the app would delete session data after transcription, but Molly wasn't convinced. She feared her private conversations were being stored somewhere in the cloud, a thought that left her deeply uncomfortable.

Then, halfway through a session, Molly noticed her therapist wasn't taking handwritten notes at all. She was recording their conversation on an iPad, using the AI app without Molly's explicit consent. 'It left me feeling violated,' Molly shared. When she confronted her therapist, the therapist admitted to using the app and promised not to do it again. But the damage was done: Molly's trust was shattered, and she never returned.

Molly’s story isn’t an isolated incident. The National Alliance of Counsellors and Psychotherapists (NACP) in the UK has received numerous complaints this year from clients who discovered their therapists were using AI—whether for note-taking, transcribing sessions, or even drafting emails. ‘There are certainly therapists experimenting with AI, often without properly disclosing it to their clients,’ says Meg Moss, head of public affairs and advocacy at the NACP. ‘Clients are understandably uncomfortable with this lack of transparency.’ Reddit threads are filled with similar stories, like one client who was shocked during a Zoom session when their therapist accidentally shared their screen, revealing they were using ChatGPT to formulate responses in real-time.

Photographer Brendan Keen shared a similar experience on Medium. After a successful video session with his BetterHelp therapist, he used the platform’s text chat feature and received a reply that felt eerily familiar. Upon confronting his therapist, she admitted to ‘referring’ to AI without disclosing it. Keen felt betrayed, questioning the confidentiality of his sessions. ‘It’s not just about the technology,’ he wrote. ‘It’s about the trust between therapist and client being compromised.’

The bigger question is this: Is AI in therapy ethical, and where do we draw the line? Therapists like Ranjith Devakumar argue that AI is a valuable tool. He uses platforms like ChatGPT and Microsoft Copilot to brainstorm ideas or research techniques, but he is careful never to input client-specific information. 'It's a sounding board, not a replacement for human judgment,' he explains. Not all therapists are as cautious, however. Ruby Mitchell, an NCPS-accredited therapist, admits she has been tempted to use AI for quick feedback after challenging sessions but resists, knowing the risks. 'If something goes wrong, and you've relied on AI, where does that leave you in front of a complaints panel?' she asks.

Richard Miller, an ethics consultant with the British Association for Counselling and Psychotherapy (BACP), is working on an ethical framework to guide therapists on AI use. 'Your client's story belongs to them, not to a server,' he emphasizes. 'We need the highest standards of confidentiality, and until we can guarantee that, we shouldn't rush to adopt this technology.' There is also a subtler risk: AI isn't just a tool but a third party in the room, with its own biases and limitations. Trained largely on Western data, it may struggle to understand clients from diverse cultural backgrounds, potentially doing more harm than good.

While some therapists see AI as a way to streamline their work, others argue it is simply unnecessary. 'Therapists have been doing their job for years without AI,' Moss points out. 'Why risk complicating the therapist-client relationship?' She warns that clients may feel uncertain if they know AI is involved, questioning whether their notes or responses are genuinely human. 'Unless it's absolutely necessary, I'd caution against using it,' she advises.

So, what do you think? Is AI a helpful assistant or a dangerous intrusion in therapy? Should therapists be required to disclose its use, or is it their prerogative to decide? Let us know in the comments—this is a conversation that’s just beginning, and your voice matters.

Author: Jamar Nader
