Page 11 - IAV Digital Magazine #624

iAV - Antelope Valley Digital Magazine
Patient Hospitalized After Following AI Chatbot’s Dangerous Health Tip
https://www.youtube.com/watch?v=vayh2gYIMSM
Artificial intelligence is infiltrating nearly every aspect of daily life. From booking trips to writing emails, AI chatbots are becoming go-to tools for quick answers. The phrase “just ask ChatGPT” has become second nature for many. But when it comes to health advice, experts warn we may be leaning too far into uncharted (and unsafe) territory.
One man learned this the hard way. In a case that has alarmed medical professionals and AI sceptics alike, a patient was hospitalised with a rare and dangerous form of poisoning after following dietary advice from ChatGPT.
Trying to cut down on his salt (sodium chloride) intake and improve his overall well-being, the man consulted the chatbot for guidance. According to a detailed medical report in the Annals of Internal Medicine, ChatGPT allegedly advised him to substitute regular salt with sodium bromide.
Without further research or consulting a medical professional, the man sourced sodium bromide online and began incorporating it into his daily routine.
Sodium bromide is not a dietary supplement. It is commonly used as a water disinfectant, sanitiser, slimicide, bactericide, algicide, fungicide, and molluscicide. This critical context was reportedly missing from the chatbot’s advice.
Roughly three months after introducing it into his diet, the man began suffering from paranoid delusions – at one point insisting his neighbour was trying to poison him.
“In the first 24 hours of admission,” physicians wrote, “he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability.”
Once stabilised on medication, the patient was able to explain the full context, including the role ChatGPT played in his decision-making. Blood tests revealed he was suffering from bromism, a rare and serious condition caused by the toxic accumulation of bromide in the body.
Typically, safe levels of bromide in the blood fall below 10 mg/L. This man’s levels were a staggering 1,700 mg/L.
“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” experts concluded in the published case study.
The incident serves as a sobering reminder: while AI can be a powerful tool, it should not replace medical advice from trained professionals. When it comes to health, trusting a chatbot over a doctor may carry dangerous consequences.