February brings thoughts of love…and monsters. That's right. Many of our classic love stories have a monster theme (though an argument could be made that the monster stories have a love theme). Monstrous desire is showcased in Bram Stoker's classic vampire masterpiece, Dracula. This dark tale tackles themes of forbidden desire and the power of love to resist darkness and temptation. The Strange Case of Dr. Jekyll and Mr. Hyde showcases the creepy, obsessive love of a physician turned monster. In the classic film adaptations of Robert Louis Stevenson's novella, Mr. Hyde's infatuation with a young woman named Ivy turns possessive and controlling. And finally, you can't forget Mary Shelley's Frankenstein, a story of embracing otherness. The monster's demand for a mate to meet his need for love and connection portrays the struggle of an outcast seeking understanding and companionship.

I'm going to make a giant leap and say that using generative AI tools is a type of monster love. Unless you've been stranded on a deserted tropical island (and I admit that at this time of year, I'm a little envious if you have), you're familiar with the excitement around the applications of generative AI tools like ChatGPT from OpenAI. In the library, we've also taken an interest in generative AI. However, our interest lies in the implications of using generative AI as a research assistant—specifically, the citations generated by generative AI tools.

Do you remember the first time you entered a query into ChatGPT, and it responded? And how about that little thrill and rush you experienced when you commented on its response, and it commented back? It was a real-life manifestation of how Dr. Frankenstein must have felt when his monster came to life. But like Frankenstein's monster, generative AI has much to learn. And just like the unforeseen consequences of monster creation, the citations produced by generative AI are unexpected and undesirable.

I know what you're thinking: But why? Isn't using generative AI just like searching in the library? And the answer to that question, my friends, is no, it's not. A library database is a collection of resources (academic journals, newspapers, reports, magazines, ebooks, etc.) that have actually been published and are stored and indexed for searching. Unlike the library's databases, which are information retrieval systems, generative AI tools like ChatGPT are large language models designed to predict text. They are trained on a large dataset of internet sources and analyze a string of words to predict which word will most likely come next. So, when we compare ChatGPT (text prediction) with the library's databases (information retrieval), we're looking at two very different systems designed to do two very different things. When you ask ChatGPT to step outside its lane and do research, it errs because that's not what it was designed to do. And because it's a machine designed to always attempt an answer, it creates "hallucinations" (or fake results, though I prefer the term fabrications) to fulfill your request even though it can't actually do the thing you asked. This is especially prevalent in its citation generation.
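The difference can be sketched with a toy example. (This is purely illustrative and wildly simplified: real language models are neural networks trained on billions of parameters, and real library databases use far more sophisticated indexing. All the names and data here are made up.)

```python
from collections import Counter, defaultdict

# --- Toy text predictor (what an LLM does, vastly simplified) ---
# It learns which word tends to follow which, then guesses the next word.
corpus = "the monster seeks love the monster seeks a mate the mate seeks love".split()
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word; plausible, but possibly wrong."""
    return following[word].most_common(1)[0][0]

# --- Toy information retrieval (what a library database does, vastly simplified) ---
# It indexes documents that actually exist and can only ever return those documents.
documents = {
    "doc1": "gothic monsters in victorian literature",
    "doc2": "vampire themes in dracula",
}
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)

def search(term):
    """Return IDs of stored documents containing the term; it never invents results."""
    return sorted(index.get(term, set()))

print(predict_next("monster"))  # a statistical guess at the next word
print(search("dracula"))        # only real, indexed documents: ['doc2']
print(search("frankenstein"))   # nothing indexed, so nothing returned: []
```

Notice the asymmetry: the retriever can honestly come back empty-handed, while the predictor always produces *something*, whether or not it corresponds to anything real. That, in miniature, is where hallucinated citations come from.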

According to an article published by Walters and Wilder (2023) in Scientific Reports, of 636 references ChatGPT-3.5 (the free version) was asked to generate for literature reviews on 42 multidisciplinary subjects, a whopping 55% were fabricated. This is just one study, but the numbers and anecdotal accounts across the literature tell us that relying on generative AI to produce a list of sources for your research will cause more problems than it solves.

Out of curiosity, I gave ChatGPT a very basic prompt:

Find five academic journal articles on the use of monsters in classic literature and display them in APA format

It generated the following list:

  1. Kucich, J. (1996). "Monsters of Fiction: The Victorian Gothic and English Ideology." Studies in English Literature, 1500-1900, 36(2), 333-350.
  2. Cohen, J. (1996). "Monster Culture (Seven Theses)." Journal of Popular Culture, 30(2), 13-32.
  3. Heller, T. C. (2001). "Frankenstein and the Miltonic Creation of Evil." Renascence: Essays on Values in Literature, 53(4), 239-252.
  4. Botting, F. (2006). "Gothic Returns in Dracula." Textual Practice, 20(2), 215-238.
  5. West, A. (2017). "Archetypes of Monstrousness in Victorian Gothic Literature." Victorian Literature and Culture, 45(1), 89-106.

I researched each citation and found that not one of them was genuine. Two were fabricated entirely, and the rest were patchworks of real titles, authors, and book chapters stitched together. In an attempt to be helpful, ChatGPT even put every citation in the correct APA format for a journal article. However, none of them were actual journal articles. Many years ago, when I worked at my college newspaper, our faculty advisor would tell us, "If your mother tells you she loves you, check it out." The same advice applies to citations produced by generative AI: check each one carefully for fabrications and inaccuracies.
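If you want to make that checking a bit more systematic, here's a minimal, purely illustrative sketch (the regex and the `split_citation` helper are my own inventions, not a library tool). It only breaks an APA-style journal reference into its parts so you can look each one up by hand; no script can tell a real citation from a fabricated one.

```python
import re

# A rough APA journal-article shape: Author, A. (Year). "Title." Journal, Vol(Issue), pages.
# This pattern is deliberately loose and will miss many valid variations.
APA_PATTERN = re.compile(
    r'^(?P<authors>[^(]+)\s*\((?P<year>\d{4})\)\.\s*'
    r'"?(?P<title>[^".]+)[".]*\s*'
    r'(?P<journal>[^,]+),\s*(?P<volume>[\d-]+)'
)

def split_citation(citation):
    """Break an APA-style citation into fields to verify one by one:
    search the title in a library database, confirm the journal exists,
    check that the author has actually published the piece."""
    match = APA_PATTERN.match(citation)
    if not match:
        return None  # doesn't even look like an APA journal citation
    return {key: value.strip() for key, value in match.groupdict().items()}

parts = split_citation(
    'Cohen, J. (1996). "Monster Culture (Seven Theses)." '
    "Journal of Popular Culture, 30(2), 13-32."
)
print(parts)
```

Each extracted field (authors, year, title, journal, volume) is a separate thing to verify; a fabricated citation often mixes a real author with a real journal and an invented title, which is exactly why checking the parts individually matters.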

Better yet, use the library databases for academic research. Our databases have been selected with our curriculum in mind. While not every resource is appropriate for every purpose (think peer-reviewed sources versus newspaper articles), they are real and can be verified. Moreover, the resources in our databases include open-access materials (journals freely available on the internet) and subscription-based resources, which means they are more current and extensive than what generative AI tools can draw on.

So, as a research assistant, ChatGPT and other generative AI tools are far from perfect (and since they aren't information retrieval systems, this isn't unexpected). But that doesn't mean there aren't tools available to help you with your research.

  1. Ask Us. The university library has a staff of librarians available to assist you in using its databases. The librarians will recommend databases and provide search strings to help you find the credible (and real) information you need for your research. And if you think asking the librarians for guidance is too scary (or just isn't for you), one of the top pieces of feedback we hear is that "asking the librarian for help was way easier than trying to do it myself!"
  2. Toolkits. The database toolkits are a collection of guides that provide tips and tutorials on using library databases.
  3. Escape from PS648! This virtual escape room tackles the monstrous questions associated with APA citations with (what else?) a zombie apocalypse.

We also want to give you a little Valentine's gift this month. In keeping with our monster theme, we have created a follow-up virtual escape room to Escape from PS648. Escape from PS648 Part II (clever title, right?) continues the adventure by addressing some common APA citation mistakes. We polled the faculty to get input on what problems they see over and over and added them to our zombie story. Enter a world where a mutated strain of Cordyceps fungus turns a population into zombies, and help find the one researcher with the cure. The catch? The researcher has left all their clues in APA citations. Now that's a horror story!

Resources

McMichael, J. (2023, January 20). Artificial intelligence and the research paper: A librarian's perspective. SMU Libraries.

Park, S., & Rozear, H. (2023, March 9). ChatGPT and fake citations. Duke University Libraries.

Walters, W. H., & Wilder, E. I. (2023). Fabrication and errors in the bibliographic citations generated by ChatGPT. Scientific Reports, 13(1), 1–8. https://doi.org/10.1038/s41598-023-41032-5


Nicole Tassinari is an associate university librarian and oversees content development. She's the proud mom of three almost-grown children who love to tell her to "just Google it!"