
Individual Admitted to Hospital Due to Risky Nutrition Guidance from ChatGPT; Medical Professionals Issue Cautionary Statement

The AI proposed bromide as a harmless alternative; the man unwittingly acted on the recommendation, with hazardous consequences

In a concerning turn of events, a man in the United States developed life-threatening bromide poisoning after following dietary advice from the AI model ChatGPT. The incident, detailed by doctors at the University of Washington and published in Annals of Internal Medicine: Clinical Cases, is believed to be the first reported case of bromide poisoning linked to AI advice.

The man, seeking an alternative to table salt because of concerns about his chloride intake, asked ChatGPT for suggestions. ChatGPT recommended sodium bromide as a safe substitute without warning him of its toxicity. For approximately three months he consumed sodium bromide, eventually developing severe bromide poisoning (bromism), characterized by paranoia, hallucinations, and psychotic episodes.

His condition deteriorated into a psychotic episode that required hospitalization and psychiatric care. After treatment with intravenous fluids and antipsychotic medication, he improved and made a full recovery over three weeks; at a follow-up visit he was in good health.

The case underscores the need for caution when using AI for health-related advice. While AI can supply scientific information, it may lack critical context or warnings about health risks. Medical experts emphasize that AI should never replace professional medical advice; in this instance, ChatGPT did not warn the man about the dangers of consuming sodium bromide.

Bromide compounds were once used in medicines for anxiety and insomnia, but they were banned decades ago due to severe health risks. Today, bromide is mostly found in veterinary drugs and some industrial products. Consuming it is not a safe substitute for table salt and can lead to life-threatening poisoning.

In the absence of the man's original chat records, doctors could not verify the exact conversation. However, when they later asked ChatGPT the same question, it again suggested bromide without warning that it was unsafe for humans.

Human cases of bromide poisoning, also called bromism, are extremely rare. This incident serves as a stark reminder of the potential dangers lurking in the advice given by AI models. It is crucial to approach AI-generated information with a healthy dose of scepticism and always consult a healthcare professional for medical advice.

  1. The man's health concern about chloride intake led him to ask ChatGPT for alternatives to table salt; ChatGPT, unfortunately, suggested sodium bromide without any warning about its toxicity.
  2. Although bromide was formerly used in medications for anxiety and insomnia, it is now mostly found in veterinary drugs and industrial products, and consuming it can cause life-threatening neurological disorders such as bromism.
  3. After consuming sodium bromide for approximately three months, the man experienced severe symptoms of bromism, including paranoia, hallucinations, and psychotic episodes.
  4. To prevent similar incidents, medical experts urge individuals to exercise caution when using AI for health-related advice, to remember that it may lack crucial context or warnings about health risks, and to always seek professional medical advice on nutrition and health.
