ChatGPT Risks: The AI chatbot gave a wrong answer to roughly 1 out of every 10 questions about breast cancer, and doctors have raised concerns about the use of ChatGPT in the healthcare sector.
Now even doctors are alarmed by ChatGPT. A study conducted in the US found that the chatbot developed by OpenAI can give false information, and the doctors involved have warned that relying on it in medicine could be dangerous. ChatGPT was asked a series of questions about breast cancer, and some of its answers were wrong. For that reason, the researchers have cautioned against the use of artificial-intelligence-based chatbots in the medical field. Concerns about ChatGPT are growing, and doctors now consider it a potential risk.
Researchers from the University of Maryland School of Medicine in the US asked ChatGPT 25 questions, all seeking advice on breast cancer screening. Each question was asked three separate times, and the answers were then reviewed by three radiologists trained in mammography.
88% of answers correct
The results showed that ChatGPT answered 88 percent of the questions correctly, in language that was also easy to understand. But some of the answers were not just wrong, they were made up. That is why the researchers have expressed concern about the use of chatbots. They urged people to be careful, since this is a new technology, and advised that for any health-related problem the opinion of doctors alone should be trusted.
inventing sources to back up its claims
In one example, an answer was based on outdated information. According to the DailyMail report, ChatGPT advised waiting four to six weeks after receiving a Covid-19 vaccine before getting a mammogram. However, that guidance was changed last year, and women no longer need to wait.
ChatGPT also gave inconsistent answers to questions such as the risk of developing breast cancer and where to get a mammogram done; the study found that its responses varied from one attempt to the next. According to Dr Paul, a researcher involved in the study, ChatGPT sometimes cites fake journal articles to support its claims.
relying on a single organization
The study also claims that a Google search can answer these questions more reliably. ChatGPT drew on only one organization, the American Cancer Society, in preparing its answers, and did not base its responses on recommendations from the Centers for Disease Control and Prevention or the US Preventive Services Task Force.