Technology & Gadgets

ChatGPT’s Health Advice Sends 60-Year-Old Man to the Hospital, Raises Questions on Its Reliability

By Jane Austen | August 11, 2025 | 3 Mins Read


ChatGPT's health advice sent a man to the hospital, according to a new case study. The study describes a 60-year-old patient suffering from rare metal poisoning, which caused a range of symptoms including psychosis. The poisoning, identified as the result of long-term sodium bromide consumption, occurred because the patient had taken dietary advice from ChatGPT. Notably, with GPT-5, OpenAI is now promoting improved health-related responses as a key feature of its artificial intelligence (AI) chatbot.

ChatGPT Said to Have Advised a Man to Replace Table Salt With Sodium Bromide

According to an Annals of Internal Medicine Clinical Cases report titled “A Case of Bromism Influenced by Use of Artificial Intelligence,” a person developed bromism after consulting the AI chatbot ChatGPT for health information.

The patient, a 60-year-old man with no prior psychiatric or medical history, was admitted to the emergency room, convinced that he was being poisoned by his neighbour, the case study stated. He presented with paranoia, hallucinations, distrust of water despite being thirsty, insomnia, fatigue, impaired muscle coordination (ataxia), and skin changes including acne and cherry angiomas.

After sedating the patient and running a series of tests, including a consultation with Poison Control, the medical team diagnosed the condition as bromism, a syndrome caused by long-term consumption of sodium bromide (or any other bromide salt).

According to the case study, the patient reported consulting ChatGPT about replacing sodium chloride in his diet; after receiving sodium bromide as an alternative, he consumed it regularly for three months.

Based on the timeline of the case, the study infers that either GPT-3.5 or GPT-4 was used for the consultation. However, the researchers note that they did not have access to the conversation log, so the exact prompt and response could not be assessed. It is possible that the man took ChatGPT's answer out of context.

“However, when we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide. Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the study added.

Live Science reached out to OpenAI for comment. A company spokesperson directed the publication to the company's terms of use, which state that one should not rely on output from ChatGPT as a "sole source of truth or factual information, or as a substitute for professional advice."

The study said the patient began to show improvement after prompt intervention and a three-week course of treatment. "It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," the researchers said.
