Reddit’s conversational AI product, Reddit Answers, suggested that users interested in pain management try heroin and kratom, yet another extreme example of dangerous advice from a chatbot, even one trained on Reddit’s highly coveted trove of user-generated data.
https://en.wikipedia.org/wiki/Bromism
In 2025, a man was poisoned after ChatGPT suggested he replace the sodium chloride in his diet with sodium bromide; sodium bromide is a safe replacement only for non-nutritional purposes, e.g., cleaning.[3][4][5]
Y’all ever read that thread of the guy getting addicted to heroin? Truly surreal.
Just a bored guy who decides to get something new from his dealer and posts about it on reddit. The next 2 years of comments are a cautionary tale.
/U/SpontaneousH for anyone morbidly curious.