A 60-year-old man in New York, USA, nearly died after following advice he received from ChatGPT. The individual had consulted ChatGPT on how to remove salt (sodium chloride) from his diet, having read that salt negatively impacts health.
Following the Artificial Intelligence (AI) chatbot's suggestion, the man ceased using salt entirely, replacing it with sodium bromide.
Sodium bromide was commonly used in medications around 1900, but in large quantities, it acts as a poison.
This case was reported in the American College of Physicians journal. The case report details that the 60-year-old had been using sodium bromide for the past three months based on ChatGPT's advice.
Upon falling ill and being hospitalised, he accused his neighbour of poisoning him.
Doctors were initially puzzled by his condition. When asked about supplements or medication, he denied taking any. However, after admission, he revealed his strict control over his diet and his homemade water preparation, prompting further investigation.
During his hospital stay, he exhibited dermatological problems and severe neuropsychiatric symptoms, including hallucinations. The case report noted that despite intense thirst, he avoided the water provided by the hospital.
The report further states that he was treated with fluids and electrolytes, eventually stabilising enough for transfer to the hospital's psychiatric unit.
The report concludes that the patient developed bromism after seeking dietary advice from ChatGPT, from which he understood that chloride could be replaced with bromide without harming his health.
Published on:
10 Aug 2025 11:22 am