ChatGPT’s advice landed a man in the hospital; here is the shocking reason

New Delhi
If you rely heavily on ChatGPT and take fitness tips or diet plans from an AI chatbot, be careful: health advice from AI can also endanger your life. We say this because ChatGPT’s advice has sent a man to the hospital. According to a report, a 60-year-old man in New York had to be hospitalized after following a strict salt-reduction regimen suggested by ChatGPT. According to doctors, the man abruptly cut sodium from his diet for several weeks, causing his sodium level to drop dangerously low, a condition called hyponatremia. His family said he trusted the AI-generated health plan without consulting a doctor.

The case, recently published in a journal of the American College of Physicians, highlights the risks of following AI-generated health advice without professional oversight, especially when it concerns essential nutrients such as sodium. It is some relief that the man recovered after spending about three weeks in the hospital, but the case has certainly raised questions about ChatGPT’s credibility.


ChatGPT’s advice proved dangerous
According to a TOI report, the man asked ChatGPT how to remove sodium chloride (commonly known as table salt) from his diet. The AI tool suggested sodium bromide as an alternative, a compound used in medicines in the early 20th century but now considered toxic in large amounts. Following this advice, the man bought sodium bromide online and used it in his cooking for three months.

Despite having no previous history of mental or physical illness, the man began to experience hallucinations, paranoia and excessive thirst. On being hospitalized, he appeared confused and refused to drink water for fear that it was contaminated. Doctors found he was suffering from bromide toxicity, a condition that is now almost unheard of but was once common, when bromide was routinely prescribed for anxiety, insomnia and other ailments. He also showed neurological symptoms, along with acne-like eruptions and red spots on the skin, which are symptoms of bromism.


The main focus of treatment in the hospital was rehydration and restoring his electrolyte balance. Over three weeks, the man’s condition gradually improved, and his sodium and chloride levels had returned to normal by the time he was discharged.

The company states this clearly in its terms
The authors of the case study emphasized the growing risk of misinformation from AI tools. The report warns: “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

OpenAI, the developer of ChatGPT, states clearly in its terms of use: “You should not rely on output from our services as a sole source of truth or factual information, or as a substitute for professional advice.” The terms also clarify that the service is not intended for use in the diagnosis or treatment of medical conditions.


Experts say AI tools can be useful for general information, but they should never replace professional medical advice. As the use of AI grows, so does the responsibility to ensure that its output is accurate, safe and clearly understood by the people who rely on it.
