In summary: ChatGPT makes up information and citations when it doesn't know the answer to a question.
In the Reuters article below, a lawyer used ChatGPT to write a legal brief, and ChatGPT fabricated six cases that were cited in it. The lawyer may have violated professional and ethical rules and could face disciplinary action.
In summary: ChatGPT makes up citations.
In the Cureus article below, two researchers used ChatGPT to write a case report. They specifically noted that ChatGPT fabricated citations to sources that did not exist, and how problematic that could be.
In summary: ChatGPT does not always summarize information correctly; it may misstate or distort it.
Ex 1: In the Nature article by van Dis, E. A. M. below, the researchers experimented with asking ChatGPT a question from their research area. They also asked ChatGPT to summarize an article they had written and found that it listed incorrect and incomplete information and misinterpreted the article.
"For example, when we asked ‘how many patients with depression experience relapse after treatment?’, it generated an overly general text arguing that treatment effects are typically long-lasting. However, numerous high-quality studies show that treatment effects wane and that the risk of relapse ranges from 29% to 51% in the first year after treatment completion 2–4. Repeating the same query generated a more detailed and accurate answer..." (pg 224)
Ex 2: The Windows Central article reports a large increase in AI-generated books, especially on Amazon. These AI-generated books are then used as source material to generate still more AI-generated books and content. The implications for misinformation and inaccuracy are troubling.
“... an ever-growing minefield of AI-written books continues to clutter the service. There are concerns that, with time, the contents of these books will be filtered back through AI as training samples which will then be used to churn out more AI-written books with AI-written reviews that risk further skewing the algorithms.”
Ex 3: In the Nature article by Conroy, G. below, researchers used ChatGPT to help write an article based on a US CDC data set "from more than 250,000 people about their diabetes status, fruit and vegetable consumption, and physical activity." The article focused on how physical activity can affect diabetes. The ChatGPT-written article concluded that eating more fruits and vegetables and exercising lowers the risk of diabetes. It also used the stock phrase that the study "addresses a gap in the literature." While that expression appears in many studies, it does not apply here: the article's findings are not new or original but are in fact quite well known.
Articles