Folk are getting dangerously attached to AI that always tells them they’re right
Sycophantic, but also “lawsuit avoidant”.
I was released from the hospital following surgery last month and I had a bleeding “event”. I use the word “event” because it sounds more festive.
Shortly after that, I went to the bathroom, and the bleeding seemed to have stopped.
Just for fun, I thought I’d ask ChatGPT what it thought, telling it the nature of the surgery, the bleeding event, the non-bleeding event, and asking it “So… best of three?”
And it went HARD on “this is not a best of three scenario! Call 9-1-1! Do it now! You could pass out! Call 9-1-1!”
I did not call 9-1-1. The bleeding did not resume, I’m fine.
Happy… bleedivus, I guess!
But seriously, I don’t think the AI was very wrong here, depending on how severe the bleeding was? Did the doctors say anything?
Normal post surgical stuff after, you know, getting gutted like a fish. 😉 Stage 2 colon cancer surgery.
Oof, I hope everything turns out great for you! (And since I just now noticed your username: thank you for everything you’ve been doing for us!)
FTFY: “Sycophantic behavior in AI affects ~~us~~ all users of AI.”

If you don’t touch it, it can’t touch you.

I don’t know. When it tells me it can’t show me the picture I asked for because of copyright guardrails, I just get kind of frustrated.