Health

Man took diet advice from ChatGPT, ended up hospitalized with hallucinations

By IQ TIMES MEDIA | August 13, 2025 | 4 min read


A man was hospitalized for weeks and suffered from hallucinations after poisoning himself based on dietary advice from ChatGPT.

A case study published Aug. 5 in the Annals of Internal Medicine, an academic journal, states that the 60-year-old man decided he wanted to eliminate salt from his diet completely. To do so, he asked ChatGPT for an alternative to salt (sodium chloride), and the AI chatbot suggested sodium bromide, a compound historically used in pharmaceuticals and manufacturing.

While the study noted that doctors were unable to review the original AI chat logs, and that the bot likely suggested the substitution for another purpose, such as cleaning, the man purchased sodium bromide and used it in place of table salt for three months.

As a result, he ended up in the emergency room with paranoid delusions, despite having no history of mental health problems. Convinced that his neighbor was poisoning him, the man was reluctant even to accept water from the hospital, despite reporting extreme thirst. His paranoia worsened, and he began experiencing auditory and visual hallucinations, eventually landing him on an involuntary psychiatric hold after he tried to escape during treatment.

What was happening to the man?

Doctors determined that the man was suffering from bromide toxicity, or bromism, which can cause neurological and psychiatric symptoms. The man also showed other signs of the condition, including acne and cherry angiomas (bumps on the skin), fatigue, insomnia, subtle ataxia (clumsiness) and polydipsia (excessive thirst).

Other symptoms of bromism can include nausea and vomiting, diarrhea, tremors or seizures, drowsiness, headache, weakness, weight loss, kidney damage, respiratory failure and coma, according to iCliniq.

Bromism was once far more common because bromide salts were used in everyday products. In the early 20th century, they were found in over-the-counter medications and often caused neuropsychiatric and dermatological symptoms, according to the study's authors. Such poisonings declined sharply after the Food and Drug Administration phased out the use of bromides in pharmaceuticals between the mid-1970s and the late 1980s.

The man was treated at the hospital for three weeks, over which time his symptoms progressively improved.

A man landed in the hospital after taking dietary advice from ChatGPT.

USA TODAY reached out to OpenAI, the maker of ChatGPT, for comment on Wednesday, Aug. 13, but has not received a response.

The company provided Fox News Digital with a statement, saying, “Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”

AI can ‘fuel the spread of misinformation,’ doctors say

Doctors involved in the case study said they suspected that the patient had used ChatGPT version 3.5 or 4.0, the former of which they tested in an attempt to replicate the answers the man received. While the study’s authors noted that they couldn’t know exactly what the man was told without the original chat log, they did receive a suggestion for bromide as a replacement for chloride in their tests.

“Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” said study authors Dr. Audrey Eichenberger, Dr. Stephen Thielke and Dr. Adam Van Buskirk.

AI carries the risk of providing information like this without context, according to the doctors. For example, it is unlikely that a medical expert would have mentioned sodium bromide at all if a patient asked for a salt substitute.

“Thus, it is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” according to the study.

This article originally appeared on USA TODAY: Man hospitalized after taking ChatGPT diet advice, study says


