My therapist used AI on me and I feel completely vulnerable and destroyed

I’ve been using Talkspace for therapy, and it has been good because my job makes in-person visits impossible (I work 9-5).

This therapist I’ve been seeing has been helpful for the most part. I struggle a lot with sexual abuse and trust, and I’m trying to get better at socializing with people again. I sent a few messages about my time spent at a bar and how I feel I’m getting a bit better. And I received a response from her saying the following (this community prevents me from sharing a screenshot):

“That’s a meaningful shift, and it’s great that she felt proud of herself. Here’s a supportive follow-up question you can ask to gently reinforce her self-trust and explore what safety and agency can look like in similar situations:

“That really does sound like a step forward—choosing to be with yourself and actually enjoying the solitude, even if it started from a place of loneliness. I’m proud of you for that too. When you think about that night, what helped you feel grounded or safe in the moment, especially when someone made you uncomfortable? And how can we build on that next time?”

Let me know if you’d like to add coping strategies for future outings or explore a journaling prompt around safety and autonomy.”

Literally. This is straight AI, and I’ve been seeing this therapist for a while now and truly thought things were getting better. I am now at a loss for what to do. Looks like my problems are now with ChatGPT…after all my problems establishing trust with other people and trying to reconnect with society, I feel betrayed again for opening up. I’m going to try in-person visits when I can, but I feel completely blindsided that my own therapist used AI to give me treatment…

Comments

  1. ZombieAutomatic5950

    🙁 🫂🫂 I don’t know what to say.. I’m sorry 🫂

  2. adudefromaspot

    This is probably some kind of ethics issue. ChatGPT is not a trained therapist and should not be giving advice. But she also should not be sharing your private information with OpenAI (ChatGPT’s corporate owner). If I were you, I’d reach out to whatever licensing board your area uses and report this. Then find a new therapist.

  3. FineWoodpecker3876

    This is super unethical on top of being unprofessional, and to top it all off, lazy AF!!

    If I were in your position I would 100% have a chat with corporate and ask for a refund of every session. I am so sorry this happened. I’m seething for you; I can’t imagine what I’d do if this happened to me.

  4. Ladyharpie

    In some form she needs to be called out on this. Whether that means discussing it with her directly (“you using ChatGPT as my therapist deeply hurt me and significantly impacted our therapeutic relationship”) or contacting her supervisor, she needs to know that this is not okay to do to you or to the others she is inevitably also doing this to.

    I know these discussions are scary and can make some people feel “guilty” despite being justified. However, you and the other vulnerable people in your position (who might not know she’s doing this) deserve to be safe, seen, and heard, especially in this setting.

    You are brave, good luck.

  5. ST1EGE

    This is probably lawsuit worthy as it’s a breach of ethics and confidentiality.

  6. PMW_holiday

    This therapist broke patient confidentiality and should lose their license

  7. IMP1017

    REPORT. THEM. NOW

  8. Lisarth

    If she’s a licensed therapist, I’m fairly certain you can report her.

  9. vngelenergy

    That’s unfortunate. You are probably not the only one they are using AI on, either. AI cannot replace real, genuine human interaction and connection, nor does it match the quality work of a certified and experienced professional. If you can say something to a higher-up, do so, so they can speak to this “therapist” and handle the rest. I wouldn’t say something directly to the therapist, as it could be more stressful on your end. Let their boss deal with them, and keep your proof.

  10. ClaireHottest525

    You didn’t deserve that. Therapy is supposed to be a space where your vulnerability is met with presence, not a generated script. What happened wasn’t just unethical, it was a betrayal of trust, and I’m so sorry. You’re not wrong for feeling shattered.

  11. PRQueencess

    Get that bag with a nice lawsuit so you can afford a REAL therapist. That’s honestly unbelievable, I bet your therapist would also be one of the first people complaining about being replaced by AI.

  12. man-a-tree

    That sounds fucking awful. I’m sorry. That last statement about how you’re going to therapy to regain trust in people and it’s been violated by your THERAPIST of all people is really the crux of it (if you decide to report them). Vulnerable human interaction seems to be at a premium in these times

  13. Necessary-Ad2110

    I think you should file a lawsuit; this is wrong on so many levels, and we can’t let this get swept under the rug and let people get away with it. I wish you the best of luck with this and with what you’ve been dealing with.

  14. isbitchy

    I would report the therapist to Talkspace.

  15. shanee_michelle

    The fact that she didn’t even edit out the part addressed to her before sending it to you is wild.

  16. GEOpdx

    She is most likely violating her ethical bounds here. She could lose her license if you filed a complaint.

  17. Aimeereddit123

    Oh HELL NAH!!! And here we go…😫

  18. GamerForEverLive

    This is very illegal, you can sue that therapist for breach of privacy and violation of patient confidentiality.

  19. Novel_Ad7276

    I think any therapist that gets caught doing this should go to prison.

  20. Sylvanas22

    Looks like she has broken one of the biggest ethical codes, which is to do no harm. I’m so sorry that your trust was broken by someone you trusted. No therapist is perfect, but it seems the one you were seeing doesn’t even trust herself to guide you. No therapist knows all the answers, and that’s okay; we are human. Perhaps she should pursue more education to feel confident in her own training and intuition.

  21. bobsled_mon

    It is not illegal to use AI and it will most likely not affect their license. There are a lot of “AI” assisted therapy tools and sessions out there. It is new but not illegal.

    With that being said, it should have been disclosed to you that your therapist uses AI, and you should have been given the opportunity to opt in or opt out. I would discuss your lack of trust and displeasure with your therapist and find a new therapist who does not use AI tooling.

    Unless we voice our displeasure, AI will start to become a lot more prominent in areas such as this. That’s a hard pass for me. Some things should remain human.

    Bottom line. Don’t give them your money.

  22. meretuttechooso

    Report to the local authorities. Usually a state board. Not now, not tomorrow, yesterday.

  23. BBQBiryani

    That’s not okay. Maybe she’s a new therapist and needed some help? But it’s unethical, and AI is not a trained or licensed professional. Honestly, you’d be justified in going to her supervisor so that this gets nipped in the bud before her colleagues or other professionals try to do the same thing. AI in the wrong hands is so dangerous.

  24. NotDido

    This is appallingly awful and I am genuinely floored. For what it’s worth, it sounds like the work you’ve been putting in is genuine, even if the feedback you got was coming from a shitty therapist. I hope this ends up being just a blip in the past for you. (Also, maybe check out zoom therapy options? I do mine on my lunch break once a week, and my therapist says it’s not unusual for 9-5ers to do that.)

  25. Wrath_Of_Aguirre

    Every time you see one of these—that’s when you know that shit is AI generated.

  26. BondJames_007

    Dump this therapist and file a complaint against her. She used AI in your therapy without your consent.

    And I doubt she’s a real therapist. She’s faking it.

  27. IamMeanGMAN

    I use Talkspace. Their Notice of Privacy Practice for Members discloses the use of AI Tools for insight and assistive use. It’s built into their application that they use for scheduling to document and provide case summaries and assist with other aspects of your care. They’re not going to ChatGPT or an external site, it’s built into their internal platform and while it leverages OpenAI the data is not shared to outside sources.

    My therapist was actually one that brought it to my attention because I work in tech. I got laid off a few months ago because I’d been pushing back on getting familiar with AI on the platform I support, it was either adapt or become irrelevant. I’ve had to make concessions since then, otherwise I’ll never work in tech again and I’ll be homeless, widowed and broke.

    They’re not a fan (for similar reasons), but I had already assumed that AI was being leveraged in some capacity. Their responses have been genuine, and they choose not to use the tool (though they do bring up notes they find interesting). Others may adopt it, but it SHOULD be at their discretion. “Clinician-Led Tools: AI tools are designed to support and empower therapists, not replace them.”

    The subject of “limerence” came up during a session with my therapist and I used AI to expand more on the topic after our video call was done. It was enlightening, and gave me a lot of insight into what I was going through. I still feel weird about it.

    Bring up the use of AI with your therapist; first and foremost, they should be focused on your care and well-being. You have every right to ask them not to cut & paste an AI response and to be more forthcoming with you. If they can’t accommodate that, you can try another therapist who is more personable.

    Having just dealt with someone in my life who has gone through SA and trust issues, I understand how hurtful this can be. It’s good that you are making the attempt to talk to someone; don’t let the tools they are using distract you from your healing. But certainly bring it up with them. I hope it works out for you.

  28. Floomby

    The thought of ChatGPT harvesting people’s real pain to learn how to be a pretend human horrifies me. The ways in which that information can be abused are legion. I am so sorry this happened to you, OP. What a massive violation.

  29. Teafork1043

    So nobody here writing emails or messages has had AI grammar-check their stuff? For all we know, they wrote it and just had AI fix certain things. I did it all the time in IT :v

  30. LegitimateGolf113

    My therapist asked if she could use AI and I declined. She respected my wishes but didn’t understand why I said no. I tried to explain that it bothers me that something else is listening and interpreting my deepest feelings and thoughts. As a therapist myself, I feel that it’s best to write my own notes anyway. Reviewing them helps me spot patterns as well. I also worry that AI will eventually replace human therapists the more it learns.

  31. Background_Mistake76

    you need to report this

  32. fireflysz

    Maybe it’s because I had a terrible experience, but fuck Talkspace… it was good to get me started with help, but I learned the hard way it shouldn’t have been permanent. The therapist I saw for one session sucked ass, and my psychiatrist ended up yelling at me through voice memos. Well, at least I can confirm she wasn’t an AI bot.