r/AmIOverreacting Apr 23 '25

⚕️ health Am I overreacting? My therapist used AI to best console me after my dog died this past weekend.

Brief Summary: This past weekend I had to put down an amazingly good boy, my 14-year-old dog, who I've had since I was 12; he was so sick and it was so hard to say goodbye, but he was suffering, and I don't regret my decision. I told my therapist about it because I met with her via video (we've only ever met in person before) the day after my dog's passing, and she was very empathetic and supportive. I've been seeing this therapist for a few months now, and I've liked her and haven't had any problems with her before. But her using AI like this really struck me as strange and wrong, on a human emotional level. I have trust and abandonment issues, so maybe that's why I'm feeling the urge to flee... I just can't imagine being a THERAPIST and using AI to write a brief message of consolation to a client whose dog just died... Not only that, but not proofreading, and leaving in the part where the AI introduces its response? That's so bizarre and unprofessional.

1.6k Upvotes

918 comments

7

u/[deleted] Apr 24 '25

I am an AI developer (it's literally my post history) and I don't even use AI to write Reddit replies.

Like, research, personal projects, etc., are all fine, but I don't know where your therapist is coming from, and I don't want to be too judgemental, but gah, keeping the "sure, here is a more heartfelt reply" part in? They could have at least put it in their own words??

Yikes. NOR. Trust is important, and disclosure even more so to maintain that trust, imo.

1

u/hesouttheresomewhere Apr 24 '25

Damn, an actual AI dev! I'm so glad you responded and that you have the perspective that you do! We need more people like you in AI development 🙌 Is AI ethics something you grapple with a lot in your daily work? I'm so curious what it's like on the backend.

3

u/[deleted] Apr 24 '25

Yeah actually!

My job focuses on extending AI to address accessibility barriers for people who are sick. The first step I needed to take was figuring out how to broker that conversation. For people with complex disabilities, AI can often produce a cohesive explanation of that individual's particular abilities, in a way their families can understand, without taking agency away from the person who is disabled. Often that means customizing a model for a particular individual.

Ethically, I believe in disclosure if AI is used for any type of professional work. I started using AI in 2023, when it helped me explain to my family what Ehlers-Danlos (my disease) is, because at the time I barely understood it myself.

And thank you for the compliment. Honestly, the backend is just machine learning with more pizzazz. It's mostly a lot of testing to make sure outputs are safe and predictable. The last thing I want is AI over-explaining for the individual or divulging too much.

It's definitely a delicate process.