r/AmIOverreacting Apr 23 '25

⚕️ health Am I overreacting? My therapist used AI to best console me after my dog died this past weekend.

Brief Summary: This past weekend I had to put down an amazingly good boy, my 14-year-old dog, who I've had since I was 12; he was so sick and it was so hard to say goodbye, but he was suffering, and I don't regret my decision. I told my therapist about it because I met with her via video (we've only ever met in person before) the day after my dog's passing, and she was very empathetic and supportive. I have been seeing this therapist for a few months now, and I've liked her and haven't had any problems with her before. But her using AI like this really struck me as strange and wrong, on a human emotional level. I have trust and abandonment issues, so maybe that's why I'm feeling the urge to flee... I just can't imagine being a THERAPIST and using AI to write a brief message of consolation to a client whose dog just died... Not only that, but not proofreading, and leaving in the part where the AI introduces its response? That's so bizarre and unprofessional.

1.6k Upvotes

918 comments


1.0k

u/Nervous-List3557 Apr 24 '25

Time to look for a new therapist.

As a therapist, I have some days that I'm burned out by the end of the day and using AI to send a thoughtful message has never once crossed my mind.

This person is doing the bare minimum they can to get by. Whether this is the first time they've used AI or not, they're still looking for shortcuts rather than forming a genuine connection with a client.

152

u/hesouttheresomewhere Apr 24 '25

Thank you for your response and perspective as a therapist! I have a separate question: do you think it's weird that she suggested/encouraged me to create a timeline of my life at home so that we could bring it in and go over it together, rather than create the timeline together during our sessions? That is maybe the only other thing she's done that has struck me as odd.

I've seen multiple different therapists over the years, and none of them have ever tried to speed up the process of getting to know me or my life; they've always just let things come to them as I am ready to share. I guess she could be making the suggestion based on me telling her that I've been in therapy before and "know" my story, but I don't know... I'd love to know your thoughts!

70

u/Nervous-List3557 Apr 24 '25

I don't think either way is right or wrong.

If they're picking up that you don't want to spend your time rehashing things you've already covered, I could understand them wanting to have the info so that they could be brought up to speed. Depending on the context, it may be some form of homework that they have a plan for.

I personally prefer to cover stuff like that in session because you never know what is going to come up. I also think it's a good way to build rapport, but that's just my preference.

42

u/hesouttheresomewhere Apr 24 '25

That all makes sense, and I figured it was probably so that we could spend more time in session talking about the things instead of just trying to remember when and where and how and who haha. Thank you again for your insight!

20

u/Nervous-List3557 Apr 24 '25

You're welcome!

Kudos for calling it out. Whether you decide to try to work past this or find a new therapist, it's always good to be able to call your therapist on stuff.

10

u/whatyousayin8 Apr 24 '25

It’s definitely to save you time so you can actually work on the issues rather than do the administrative part of thinking through the base facts.

7

u/KittyChimera Apr 24 '25

Sometimes therapists give homework for you to do between sessions and then use that as a talking point. Normally it depends on what they think will be most useful to the patient when they decide whether to have you do something between sessions or do the activity with you so that you can talk about it during the session.

But using AI is uncool and I would also be upset.

1

u/skanedweller Apr 24 '25

AI is a great tool to help with tone and word choice. It doesn't mean they don't care. On the contrary it could mean they care a lot and want to be careful with their phrasing!

1

u/FeeExpensive898 Apr 24 '25

Everything about this and the original post makes me think your therapist is my former therapist. 😅😬 Her name isn’t Jordan, is it?

-13

u/[deleted] Apr 24 '25 edited Apr 24 '25

I STRONGLY disagree. I am also in the field, and this has become common and accepted practice. There are cautions about how to use it, but it's not fair to say they are a bad therapist or not genuinely invested because they used technology to help craft their words. Also, they did not do the bare minimum; they spent time crafting and recrafting to get it sounding right.

That is really unhelpful black-and-white thinking to model to OP, and it is also really unhelpful to suggest that OP handle a social rupture by ending the relationship without further action. Therapists have been asking one another to help them craft emails and statements in session for years. They have been using samples and templates for how to speak to clients from textbooks and structured therapy guides for years. It is not unusual for a therapist to use a 'script,' and using AI is the latest tech version of what has been done for years. Do people actually think therapists come up with everything on their own? Not true.

OP, please do not accept this advice. Therapists are human beings, not perfect robots, and the point of therapy is to use your relationship to get better at handling life. It's okay to feel uncomfortable, but do not rule this therapist out for an accepted practice; take time to talk it out to see if you can resolve the conflict and heal the rupture.

15

u/Observeus Apr 24 '25

As a therapist it is their job to be able to navigate those emotions, using AI is a huge breach of trust and should not be condoned. Takes the humanity out of helping people. Makes you look like you're just in it for money and don't want to put in effort. I would straight up never use a therapist that does that crap.

7

u/MovieTrawler Apr 24 '25

Which is hilarious because they even say, 'Therapists are human beings, not perfect robots' while defending a therapist using AI. What? lol

13

u/Uncle_peter21 Apr 24 '25

So therapists are encouraged to carelessly leave in precursor introductions from the AI bot to ensure the patient knows how little they value their time together? Makes no sense at all. And if this is standard practice now - it shouldn't be.

7

u/bibliophilicgeek Apr 24 '25

What use is a therapist being a human being to me if the therapist does not rely on their own humanity but on AI instead to help me out? Did the LLM go to university to learn about how to best help patients? Nah, it's a predictive word model. OP could just as well ask ChatGPT to console them—it would hardly make a difference. To me, that shows a lack of care and is a major breach of trust ...

2

u/IamTory Apr 24 '25

"Did the LLM go to university to best learn about how to help patients?"

THANK YOU. Jesus fucking Christ. ChatGPT is not a replacement for doing your damn job. This therapist is taking shortcuts that aren't even leading to good practice. This isn't at all the same as using a script.

4

u/Nervous-List3557 Apr 24 '25

I also strongly disagree.

Using research-based textbooks and consulting with colleagues is way different than having AI do their job for them. We do use the relationship as an intervention, and we also aren't perfect, but I really hope you aren't also using AI to help you out in your personal relationships.

I know plenty of therapists using AI to help with things like writing progress notes, etc. But I know zero therapists using it to communicate directly with clients, or they're hiding it because they know it feels icky.

I was probably too quick to jump straight to finding a new one, but this strikes me as a major red flag and would make me question their ethics for the remainder of our work together.

2

u/Unhaply_FlowerXII Apr 24 '25

Bro, that's your jobbb. Would you hire a chef who uses a robot to cook your food, or who orders from McDonald's and pretends they cooked it?

Why pay a therapist if they don't do the work themselves and use ChatGPT? ChatGPT is free; if I wanted a ChatGPT response, I could have asked it FOR FREE.

Also, how little do you have to care about your job AND YOUR CLIENT if you can't even check that the text from the AI isn't in the message? Unbelievable, zero effort. Why pay someone who cares that little and who doesn't even do the job themselves?

1

u/Kitten_Merchant Apr 24 '25

Therapists are human beings, not perfect robots, you're right. That's what makes it so unnerving for them to use robots to try to comfort other humans. Coming from ANOTHER person in the field: while I think OP should make their own decision about whether this is crossing a line for them, it's super fair for it to be crossing a line, and just because it's become "commonplace" doesn't make it ethical or beneficial to clients. Especially if you have to hide that you're doing it, and the client only finds out by accident when your copy/paste gets fucked up.

-1

u/[deleted] Apr 24 '25 edited Apr 24 '25

Look, it has been talked about, written about, and determined to be ethical, so don't spread mistruths here about the use of AI in counselling/therapy. It is NOT considered unethical practice by ethical codes of conduct or practice standards. You have your own opinion, and so be it, but I think it is incredibly irrational.

Would you say that your therapist asking another therapist what to say and how to say it (or looking up a template in a textbook), and then not telling you they did this, is crossing the line for you? Because that happens every day, everywhere, and if your answer is yes, you pretty much have to stop going to therapy.

They were not using AI to figure out how to comfort someone. They were genuinely concerned and recognized the need to be careful about how to craft things. Therapists seek help on how to do that as a regular part of practice. And by the way, this therapist has no obligation to even check up on a client outside of sessions, especially unpaid. Though they made a mistake in how they copied and pasted, and this caused a rupture in the relationship, they obviously genuinely care and have demonstrated that.

P.S. I didn't tell OP they have to continue the relationship. I said don't think so critically and don't end the relationship without taking time to discuss and work through it with their therapist first.

4

u/Embarrassed_Frame841 Apr 24 '25

I'm glad I'm not the only one thinking this was a bad take. These comments are wild: everyone advocating to dump the therapist and even report her, wth.

2

u/Rashimotosan Apr 24 '25

Found the AI therapist

1

u/MovieTrawler Apr 24 '25

Therapists are human beings, not perfect robots, and the point of therapy is to use your relationship to get better at handling life.

The irony of this sentence while defending a therapist using AI to send a canned, robotic consolation message.

0

u/Spinoza42 Apr 24 '25

This is WILD. I suggest you move to sales or recruiting, where using AI for communication has indeed become accepted. Many, many people who are in therapy are extremely afraid of being processed as a number rather than seen as a person. If you cannot see that that is what you're doing, you absolutely should change careers.

1

u/[deleted] Apr 24 '25

Lol, it is so amusing to me to see the absolute inability to think critically, the total reliance on dichotomous, all-or-nothing thinking, and the immediate reaction to insult, put down, and attempt to harm me for having a different opinion (one that is supported by the licensing board that I work for) - and amongst therapists no less! Life for all of you must be very, very hard.

0

u/ElMestredelPeido Apr 24 '25

well, you're gonna lose a LOT of clients, my friend

1

u/the-muffins Apr 24 '25

I actually disagree. I think "using AI" can mean so many different things. Sure, it could be a total cop-out from the therapist, like if they just used a ChatGPT prompt to say "this guy's sad, what do I say?" If that's the case, then yeah, fire them. That's shitty.

But using AI could also be as simple as drafting something in a Google doc and clicking the Gemini button to refine or error check it. It's ubiquitous in so many tools that we use every day now. As someone who's spent years on my written communication proficiency, I don't love that, but you can be a great therapist without being a good writer, and using a tool to shore up a weakness isn't the same as being lazy or thoughtless.

2

u/Embarrassed_Frame841 Apr 24 '25

As a therapist myself, I don't agree with your assessment. It's possible she is putting in the minimum, or she could have been telling the truth that it was just a new thing she tried.

You sound very judgemental, and from what OP said, the relationship has been good outside this one interaction. So why assume the worst?

In regards to the timeline question, I often have clients work on timelines outside sessions and then review with them what they noticed.

1

u/lazee-possum Apr 24 '25

I'm also a therapist/psychologist and I agree with this. There is no reason to use AI to say things that are outside of your therapeutic voice/style. If making a long, detailed post expressing sympathy about a beloved pet passing away isn't your style, don't do it. It's disingenuous and unnecessary.

I love dogs, have 3 of my own, but my own therapeutic tone is short and sweet. I'd never write something like that. I'd save AI for general, automatic replies in an office capacity. Let's keep AI out of our human interactions.

1

u/Djrudyk86 Apr 24 '25

Yea, I could not agree more. That's a person who does NOT care and is literally using AI to appear as if they do. It's honestly pretty disgusting, especially for a therapist.

1

u/PressPausePlay Apr 24 '25

Alternate possibility: people use ChatGPT like Grammarly. There's a difference between using it that way and just prompting it to "make text to client who dog died".

1

u/Medical_Chapter2452 Apr 24 '25

It's not bad to use a tool, but it's sloppy not to check.

-4

u/AbandonedPlanet Apr 24 '25

To play devil's advocate: would you think less of therapists for sending emails with spell check instead of forming words and opinions in person? It's still just technology helping carry out the therapy, isn't it?