Once you have come to grips with whether and how your students might use generative AI in their assessment tasks, the next thing to consider is how they should reference their use of these tools. Fortunately, our colleagues from the Library have some suggestions. Now is also a good time to remind students to be critical of the tools and information sources they are using. The CRAAP test may help them think this through. What is the CRAAP test? Read on to find out!

Referencing strategies for generative artificial intelligence

Generative artificial intelligence (generative AI) is affecting academia in many ways, from debate over whether these tools should be listed as co-authors on academic papers to questions about whether and how students may use them in assessment tasks.

Where students are permitted to use generative AI in their university studies, they should, as always, demonstrate academic integrity and acknowledge their sources. Academic libraries across Australia are currently suggesting two possible referencing strategies for these tools (such as ChatGPT): treating them as non-recoverable sources or as software.

Strategy 1: Non-recoverable sources

ChatGPT responses cannot be retrieved by others: the tool currently offers no way of pinpointing a particular response (e.g. through a stable link), and re-entering the same prompt produces a different response. If citing text generated by ChatGPT, follow the format for non-recoverable information sources, such as personal communication. Generally, this format requires an in-text citation but no entry in the reference list. Some advice also recommends that writers keep a log of the prompts used to generate any response that is quoted or paraphrased.

Strategy 2: Software

Alternatively, some academic libraries recommend citing ChatGPT as software, in the same way that academic studies acknowledge tools such as NVivo or SPSS to make research methods transparent and reproducible. This strategy includes both an in-text or footnote citation and an entry in the reference list.

Macquarie University Library has now added this advice to each style in our referencing guide; however, please check back regularly for updates as approaches are standardised and confirmed. We also invite you to contact us with your use cases or alternative solutions.

ChatGPT and hidden literacies

As higher education grapples with ChatGPT and other generative AI tools (When and how should they be used in study? How does their use affect assurance of learning?), it is more important than ever that students develop critical digital and information literacy skills.

As we have always done, we are reminding students that information is easily available but varies in quality. Navigating the many voices and opinions online requires skills in searching, selecting and critiquing, so that we can actively manage the ocean of knowledge around us.

Similarly, it’s important to take a critical approach to tools that aggregate, synthesise and produce information in order to understand their affordances and weaknesses.

The CRAAP Test

The ever-popular CRAAP test (could it become CRAAP-E to include ethics?) may help frame a discussion with your students on these tools:

C: Currency

  • When was the tool created? How current is the information it uses?
  • Has anything happened since to affect the quality of the tool? (new data, other tools…)

R: Relevance

  • Does the tool meet your needs? Are responses relevant to your topic? Note this may depend on the quality of your research and your skills in framing prompts.

A: Accuracy

  • Are the responses accurate, and how do you know? Do they include citations (references) that allow you to determine where supporting information came from? Are those citations real?

A: Authority

  • Who created the tool, and what is their expertise (consider the company's mission and the resources available to them)? This may help you to understand their perspective and identify their biases.

P: Purpose

  • Why was the tool created? Does the intrinsic design push the user to certain patterns of thought and activity?

E: Ethics

  • What data does the tool collect about the user? How is this used? Is it protected?
  • Does the tool attribute the sources/creators of the data it uses?
  • Is your use of the tool ethical? Does your use of the tool conflict with the goal of your task (e.g., if you are using it for an assessment, is use of this tool permitted)?

Questions?

For more information, contact Vanessa Todd (Manager, Digital Literacies Programs), Grai Calvey (Manager, Teaching and Research Services) or Heather Cooper (Manager, Teaching and Research Services).


Share your experience

Join the conversation: Thursday, 30th March 2023, 13:00-14:00. Register free for an MQ Community Roundtable: Gen AI tools – implications for learning, teaching and assessment at Macquarie.

Banner image: Chris Stacey

Posted by Vanessa Todd
