
GP Issues Urgent Warning After Man Develops Rare Condition Following ChatGPT Query

Dr Rupa Parmar, GP and Medical Director at Midland Health, is warning the public about the risks of relying on tools like ChatGPT for health advice, after a 60-year-old man developed a rare and dangerous condition by following AI-generated guidance.

The man developed bromism, also known as bromide toxicity, after consulting ChatGPT about eliminating table salt (sodium chloride) from his diet. Having read about the negative effects of sodium chloride, he asked ChatGPT how to remove chloride from his diet and subsequently took sodium bromide over a three-month period.

He presented at hospital with paranoia and attempted to escape during his stay; he later developed facial acne, excessive thirst, and insomnia, and was treated for psychosis. Sodium bromide was used as a sedative in the early 20th century, and a medical professional would be highly unlikely to suggest it as a replacement for table salt.

In response to the case, Dr Rupa Parmar said: “This unfortunate case is a harsh reminder of the dangers of relying on tools like ChatGPT for health advice. Chatbots can produce scientific inaccuracies. After all, the internet is full of medical myths, and AI is only as reliable as the information it’s trained on. This means that in many cases, AI provides users with information without proper risk warnings, fuelling the spread of misinformation and contributing to a number of preventable health issues. 

“Your health is far too important to gamble on unverified advice. Whether it’s a new symptom, a dietary change, or any other medical concern, you can’t just take the AI’s word for it. You should always speak to a qualified healthcare professional, and be open with them about where you’ve been getting your information. They will be able to fill in the blanks and provide you with a complete picture of your health, advising you on your next steps and potential treatments.

“While AI is an impressive tool, it’s not a doctor. It can’t ask follow-up questions, pick up on subtle warning signs, or judge when something needs urgent attention. Plus, if you feed it incomplete information, it can give you advice that’s not just wrong, but potentially dangerous.

“For those who struggle to get a GP appointment, it’s best to explore other safe options rather than turning to AI for diagnosis. For example, NHS 111 is available day and night. Plus, pharmacists can give expert advice on many minor illnesses, while walk-in centres are there for issues that can’t wait but aren’t life-threatening. And if it’s an emergency, don’t delay. Call 999 or go to A&E immediately.”