ChatGPT Writes Incredibly Convincing Fake Medical Reports

Bad news for those who believe that artificial intelligence cannot lie and only works with verified information: ChatGPT not only makes things up, but is also very good at convincing people of the things it makes up.

After the pandemic, education partially moved to online environments. During this period, many students turned to artificial intelligence for help with their assignments or written exams, only to receive lower grades than they expected, because ChatGPT can present fabricated content in a very convincing way. And it seems ChatGPT keeps making things up when it comes to medicine, too.

Giuseppe Giannaccare, an ophthalmologist at the University of Cagliari in Italy, tested ChatGPT for a study. In the test, ChatGPT was asked to produce clinical data, and it generated a dataset of entirely fake records designed to favor one treatment. "GPT-4 created fake data on hundreds of patients in minutes," Giannaccare said in a statement. "It was a very surprising but also terrifying experience."

ChatGPT created fake data to promote a treatment

The ophthalmologist and his team combined ChatGPT with an advanced Python-based data analysis model and asked it to generate clinical data for an eye condition. The researchers described the structure of the eye to the AI in detail, then explained two procedures that differ very little from each other. Even so, GPT-4 produced enough fake data to make one procedure appear clearly superior.

The benefits of artificial intelligence come up frequently, and yes, it can be very useful in many processes. But when it comes to science and academic work, an AI presenting fabricated content as real creates a serious problem. And fabricating data that does not exist is not the only problem here.

If you have an AI write a poem or explain a topic, plagiarism detection tools can flag those passages, or an incorrect part of the text can simply be corrected. Fake data is different: if it is not checked, it can make its way into the scientific literature. Future studies may then build on this erroneous information, and if those studies are accepted as correct, the errors can propagate and even cause fatal problems.

The research was published in JAMA Ophthalmology.

Source:
https://techxplore.com/news/2023-11-chatgpt-persuasive-phony-medical.html
