A 60-year-old man ended up in the hospital after seeking dietary advice from ChatGPT and unintentionally poisoning himself.
According to a report published in the Annals of Internal Medicine, the man wanted to eliminate salt from his diet and asked ChatGPT for a replacement.
The artificial intelligence (AI) platform recommended sodium bromide, a chemical commonly used in pesticides, instead. The man then purchased sodium bromide online and used it in place of salt for three months.
The man eventually went to the hospital, fearing his neighbor was trying to poison him. There, doctors discovered he was suffering from bromide toxicity, which had caused paranoia and hallucinations.
Bromide toxicity was more common in the 20th century, when bromide salts were used in various over-the-counter medications. Cases declined sharply after the Food and Drug Administration (FDA) phased out bromide between 1975 and 1989.
The case highlights the risks of relying on ChatGPT for complex health decisions without sufficient understanding or proper AI literacy.