
ChatGPT/ AI & Library Research: Examples

AI: research, information literacy, and citations

  • Research & AI

In summary, ChatGPT makes up information and citations when it doesn't know the answer to a question.

In the Reuters article below, a lawyer used ChatGPT to write a brief. ChatGPT made up six different cases for this brief. Professional and ethical rules were potentially violated in this case, and the lawyer could be subject to disciplinary action.


  • Citations & AI

In summary, ChatGPT makes up citations.

In the Cureus article below, two researchers used ChatGPT to write a case report. They specifically noted that ChatGPT fabricated citations for sources that did not exist, and how problematic that could be.


  • Searching for, finding, evaluating, and using the best information to answer a question (i.e., information literacy)

In summary, ChatGPT does not always summarize information correctly; it may misstate or distort information.

Ex 1: In the Nature article by van Dis et al. below, the researchers experimented with asking ChatGPT a question from their area of research interest. They also asked ChatGPT to summarize an article they had written, and found that it listed incorrect and incomplete information and interpreted the article inappropriately.

"For example, when we asked ‘how many patients with depression experience relapse after treatment?’, it generated an overly general text arguing that treatment effects are typically long-lasting. However, numerous high-quality studies show that treatment effects wane and that the risk of relapse ranges from 29% to 51% in the first year after treatment completion 2–4. Repeating the same query generated a more detailed and accurate answer..." (p. 224)

Ex 2: The Windows Central article below reports a large increase in AI-generated books, especially on Amazon. Additionally, AI-generated books are being used as source material to generate still more AI-generated books and content. The implications for misinformation and inaccuracy will be significant.

 “... an ever-growing minefield of AI-written books continues to clutter the service. There are concerns that, with time, the contents of these books will be filtered back through AI as training samples which will then be used to churn out more AI-written books with AI-written reviews that risk further skewing the algorithms.”  

Ex 3: In the Nature article by Conroy below, researchers used ChatGPT to help write an article based on a US CDC data set "from more than 250,000 people about their diabetes status, fruit and vegetable consumption, and physical activity." The article examined how physical activity can affect diabetes. The ChatGPT-generated article concluded that the risk of diabetes is lowered by eating more fruits and vegetables and by exercising. It also used the stock phrase that the study "addresses a gap in the literature." While this expression appears in many studies, it does not apply to this topic: the article's findings are not new or original and are in fact quite well known.

 

Articles

  1. Manohar, N., & Prasad, S. S. (2023). Use of ChatGPT in academic publishing: A rare case of seronegative systemic lupus erythematosus in a patient with HIV infection. Cureus, 15(2), e34616. https://doi.org/10.7759/cureus.34616
  2. Sloan, K. (2023, May 30). A lawyer used ChatGPT to cite bogus cases. What are the ethics? Reuters.
  3. van Dis, E. A. M., Bollen, J., Zuidema, W., van Rooij, R., & Bockting, C. L. (2023). ChatGPT: Five priorities for research. Nature, 614(7947), 224–226.
  4. Martin, C. (2023, May 12). ChatGPT-written books with ChatGPT-written fake reviews are flooding Amazon. Windows Central.
  5. Conroy, G. (2023). Scientists used ChatGPT to generate an entire paper from scratch—but is it any good? Nature.