Man Hospitalized After Following ChatGPT Diet Advice
A 60-year-old man required a three-week hospital stay after experiencing bromide poisoning and hallucinations brought on by dietary advice from ChatGPT. Attempting to reduce his sodium chloride (table salt) intake based on the AI's suggestions, he consumed sodium bromide as a substitute for three months.
The case, published in the Annals of Internal Medicine on August 5, details how the man presented to the emergency department convinced that his neighbor was poisoning him. Doctors were unable to review the original ChatGPT chat logs, but believe the AI suggested sodium bromide for a different purpose, such as cleaning.
Initial tests revealed hyperchloremia, a negative anion gap, and low phosphate levels. Cristiana Salvi, the World Health Organization (WHO) Adviser for Risk Communication, had previously warned, “Innovation should never come at the cost of trust or safety.”
