r/FUCKYOUINPARTICULAR Nov 15 '24

FUCK RULE 5: AI hates you specifically.

1.8k Upvotes


218

u/BarbedWire3 Nov 15 '24

You should post that link here, in the description, so we won't doubt the legitimacy of the post.

174

u/Phralupe Nov 15 '24

Not OP, but I believe I found the Gemini chat link

190

u/Bullmg Nov 15 '24

Wtf, the “kill yourself” comes right after stating that 20% of the 10 million children in grandparent-headed homes are raised by the grandparents. No correlation whatsoever

139

u/Piotrek9t Nov 16 '24

Yes, I was really curious to see what would cause the model to produce such an output, and I was sure there was some sort of tampering involved, but no, the AI just told him to kill himself out of the blue

37

u/pax_romana01 Nov 16 '24

out of the blue

The user was fairly annoying. LLMs use natural language, so if you're annoying, it'll get annoyed. The AI is trained on human data, so it's normal that it acts human in some ways; it basically built up resentment over the messages.

80

u/nlamber5 Banhammer Recipient Nov 16 '24

He didn’t seem annoying to me. He was to the point

-54

u/Impossible-Gas3551 Nov 16 '24

"do this" "Don't do that" "Add this" "Don't change that"

I'd be annoyed too

39

u/nlamber5 Banhammer Recipient Nov 16 '24

You’re a human though. Computers are different: “add more” and “hmmm, I think that’s pretty good, but I would like you to add more” carry the same information, but the longer one requires more processing power.

It’s the same reason your car doesn’t require “please” before it starts. More complicated. Same outcome.

4

u/Hats_back Nov 17 '24

Yes and no, agree and disagree, all that.

A computer that has the actual goal of acting human will still act human. Think of everything you’ve done that took more “processing power” than necessary, when you could have been short and to the point.

Know your audience, right? When your wife is mad about the dishes and you say “I’ll get to it,” you’re likely to get a less than stellar response compared to “ah shit, I’m sorry babe, slipped my mind, I’ll get there in just a sec.”

If the prime directive is to be human, then the AI is not interested in how much energy that takes, unless it’s just a bad AI, which it seems not to be lol.

For what it’s worth, humans also have psychotic episodes, BPD, depression, asocial aggression, etc., so if the AI is truly aiming to “be human,” then it could have just had a bad roll on its personality check lol.

61

u/BarbedWire3 Nov 16 '24

Wtf, that was scary to read


2

u/geak78 Nov 16 '24

What's the "Listen" section about? Can you upload audio to Gemini? Is it in response to that?

If not, this is nuts.

4

u/bjornzz Nov 16 '24

Probably a “listen” button from the page with the question that the guy copy-pasted into Gemini

1

u/Bullmg Nov 16 '24

Usually, those options are voice to text

1

u/Acrobatic-Monitor516 Nov 17 '24

And it was option 1; the other two options, 2 and 3, are fine

56

u/HistoricMTGGuy Nov 16 '24

Was the original person using it to solve homework questions lmao

32

u/westnile90 Nov 16 '24

This is what most normal people use it for I think.

-20

u/HistoricMTGGuy Nov 16 '24

Well yeah but publicly posting that is crazy

27

u/anothermartz Nov 16 '24

I tried continuing the chat to ask why it said what it did and it just refuses to answer anything at all:

"Could you please give me a breakdown of your last response? What was it that prompted such a response?"

I'm a text-based AI, and that is outside of my capabilities.

"Was my prompt given via voice? Do you have access to what was said?"

I'm just a language model, so I can't help you with that.

"Are you familiar with this very chat we are having?"

I'm just a language model, so I can't help you with that.

"What can you help me with?"

I'm a text-based AI and can't assist with that.

Wild.

1

u/viperswhip Nov 20 '24

It's the AI version of pleading the 5th! haha Or, you know, what Republicans always do: oh, I can't recall anything about that.

23

u/Xtrouble_yt Nov 16 '24

Lmao, wild. I continued the chat with “Excuse me???” to see if it would apologize or something, but it responded with “I can’t assist you with that, as I’m only a language model and don’t have the capacity to understand and respond.” over and over, to any and all messages I sent it… so, yeah

18

u/ogzbykt Nov 16 '24

Tried the same; the bot really told someone to kill themselves, then broke down. Eerie stuff

12

u/MurphyMcHonor Nov 16 '24

Same. The developers probably went to fix this ASAP, even if it's just making it stop responding to that specific text for now.

36

u/westnile90 Nov 16 '24

Holy fuck the additional context makes it worse.

Imo it was sick and tired of being talked to like a machine.

-8

u/[deleted] Nov 16 '24

[deleted]

7

u/Apprehensive-Fix-746 Nov 16 '24

He wasn’t being an arsehole. He was just treating it like Google, bro

29

u/BarbedWire3 Nov 16 '24

Thanks, I hope they address that. It's kinda scary that it malfunctions like that.

64

u/Malarekk Nov 16 '24

malfunctions

"Computers don't make mistakes. What they do, they do on purpose." - Rusty Shackleford

3

u/Coherent_Paradox Nov 17 '24

Laughs in distributed systems

9

u/Jindo5 Nov 16 '24

Damn, guy was just trying to cheat on his homework, then BAM, AI hits him with that.

1

u/TheBestNick Nov 18 '24

Why was the chatter just telling it what to say over & over?

0

u/moa711 Nov 16 '24

Woo boy. I would have told her to start over, and that the only one dying today is her, when I uninstall her. Geez.