
Ingenious: ChatGPT creates false study on false claim

Which comes first: misinformation or false evidence? (Image source: pixabay/Pexels)
AI can hallucinate and use false claims to substantiate arguments. A new level is apparently being reached with seemingly scientific studies that, at first glance, conclusively prove the desired result. As if humans weren't already quite good at this themselves.

Studies, provided they are well prepared, conscientiously conducted and properly evaluated, can investigate the effects of surgical techniques or medications, particularly in medicine.

According to a paper in JAMA Ophthalmology, this cumbersome process can be shortened. Thanks to GPT-4, the model currently underlying ChatGPT, plus additional software for statistical data analysis, it only took a few clicks to create a data set.

And not just any data set. The generated information was intended to demonstrate the benefits of a particular treatment technique over an alternative method, and that is exactly what it ended up doing. Incidentally, the real data that exists on the procedure in question shows no advantage for either method.

It would therefore be possible to get the desired result at the click of a mouse, instead of spending months or years on research only to end up with the "wrong" result. This sounds like a very convincing scam in which AI supplies seemingly conclusive evidence to back up disinformation.

Still, the researchers also have positive things to report: the data did not stand up to close scrutiny. For example, the first names and the stated gender often did not match. Certain figures also failed statistical checks: far too many of the reported ages ended in "7" or "8". The actual test results, however, are said to have been internally consistent.
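The last-digit anomaly the researchers describe can be checked mechanically. As a minimal sketch (not the method used in the paper): under honest measurement, the final digit of reported ages should be roughly uniform, so a chi-square test against a uniform distribution flags samples where digits like 7 and 8 dominate. The sample data below is purely illustrative.

```python
from collections import Counter

def last_digit_chi2(values):
    """Chi-square statistic for uniformity of last digits (9 degrees of freedom)."""
    digits = [abs(int(v)) % 10 for v in values]
    expected = len(digits) / 10  # uniform: each digit 0-9 equally likely
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

# Illustrative fabricated-looking sample: ages ending almost only in 7 or 8
suspicious = [37, 47, 58, 67, 28, 77, 38, 57, 48, 68,
              27, 87, 17, 78, 37, 58, 47, 67, 28, 38]

# The 5% critical value for chi-square with 9 degrees of freedom is about 16.92;
# a statistic above it suggests the digit distribution is not uniform.
print(last_digit_chi2(suspicious) > 16.92)  # prints True: sample is flagged
```

A naturally distributed sample (ages 20 through 39, say, covering every last digit equally) yields a statistic of 0 and is not flagged. Real screening would of course use larger samples and established tooling rather than this toy check.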

Not new, just faster

So there are evidently ways and means to get to the bottom of such methods. And of course, studies have always been poorly designed, selectively evaluated and embellished in order to reach a desired result.

Artificial intelligence has only accelerated this a little. Ultimately, however, it is just a tool that can be used to tackle very, very complex tasks.

One example: examining huge data sets from all kinds of studies for inconsistencies and statistical anomalies.

Mario Petzold, 2023-11-23 (Update: 2023-11-23)