
Vibes Of India

ChatGPT Diet Chart Leads To Rare Poisoning

Updated: August 11, 2025 18:39


In a strange and troubling case, a man developed a life-threatening condition after following health advice from ChatGPT.

Doctors believe it may be the first recorded instance of bromide poisoning linked to artificial intelligence, according to a report in a prominent online media outlet.

The case, published by University of Washington doctors in Annals of Internal Medicine: Clinical Cases, describes how the man consumed sodium bromide for three months after ChatGPT reportedly suggested it as a safe alternative to chloride.

Apparently, the AI failed to warn him of the dangers of bromide, which is not safe for human consumption.

According to available information, bromide compounds were once used in medications for anxiety and insomnia but were largely discontinued decades ago due to their harmful side effects. Today, bromide is mostly found in veterinary drugs and industrial products. Cases of bromide poisoning—known as bromism—are now extremely rare.

The man first visited an emergency room with paranoid delusions, convinced his neighbour was poisoning him. Though some of his vital signs appeared normal, he was hallucinating, refused water despite being dehydrated, and rapidly deteriorated into a psychotic episode. Doctors placed him under involuntary psychiatric care.

After receiving intravenous fluids and antipsychotic medication, his condition began to improve. Once he was stable, he revealed that his symptoms began after acting on dietary advice from ChatGPT. Concerned about his sodium intake, he had asked the AI for alternatives to chloride. It allegedly recommended bromide—without warning of the serious health risks.

Although the original conversation wasn’t available, doctors replicated the query and found that ChatGPT did, in fact, mention bromide as a potential substitute, omitting any warning that it is unsafe for human consumption. Experts say the case highlights the dangers of AI providing information without proper context or understanding of health implications.

Ahmedabad-based dietician and nutritionist Dr. Shipra Bhatnagar told Vibes of India that this case resembles those of many others who turn to ChatGPT for solutions without realising the side effects that could follow.

Diet cannot be generalised, she said, adding that while a generic diet may help for a short period, it does not work in the long run.

According to Dr. Bhatnagar, diets are personalised based on BMR, body composition, food habits and other factors.

Unguided dieting can also lead to protein deficiency and poor immunity, she warned.

The patient recovered after a three-week hospital stay. Doctors caution that while AI tools can make scientific information more accessible, they are no substitute for professional medical guidance—and, in some cases, can be dangerously misleading.

Also Read: Parliament Passes Controversial Bills Amid Chaos And Opposition Protests https://www.vibesofindia.com/parliament-passes-controversial-bills-amid-chaos-and-opposition-protests/
