Remember when search engines arrived and became tools that we all use? Generative Artificial Intelligence may well do the same. These tools do more than generate text – they also produce images (DALL-E 2, Stable Diffusion), designs (StockIMG.AI) and audio (Podcastle.AI), and can be used for video editing (Runwayml) or for reviewing scientific literature (Elicit.org). The space is expanding rapidly, with over 1000 AI tools now available.
While the scope for change is limited for assessment tasks already in train for 2023, there are things we can do to support students to work with integrity that do not require changes to the MQCMS.
The following ideas are a collection of tips and insights from across Macquarie. They are intended as actionable first steps towards our collective AI literacy that we can start taking now.
Test drive and tweak
Test-driving assessment task questions from your units using generative AI tools will provide a good sense of what to expect from such tools. Make any viable adjustments to the task and take the AI output into consideration when marking student responses, and when conducting moderation with marking teams.
You may also like to try this activity:
Assessment design reflection questions – an explorer of assessment risk posed by AI tools.
Some tweaks that are applicable to many assessments:
- Make assessment more localised – Contextualise the task to refer to local events, places, organisations or people, and to use recent (post-2021) sources and examples.
- Make assessment tasks unit-specific – Students are explicitly asked to integrate sources, references, models, theories or material used in the unit in their assessment response.
- Include non-text elements – Include non-text resources within the assessment prompt or question. Images, infographics, diagrams, flowcharts, models, videos or other multi-media elements are more difficult to submit as part of a query to a generative AI tool. However, caution is needed because these will present accessibility barriers to some students.
- Create new questions – Avoid reusing case studies, questions or prompts from previous sessions and similarly, avoid asking generic questions about well-known events or entities as part of an assessment task.
- Connected assessment tasks – Multi-phase assessment tasks (e.g. drafts, outlines, design briefs) with feedback between stages allow educators to observe changes in student work. Check for consistency between phases and between the justifications given and the product produced, and compare student performance on the task with their performance on other tasks in the unit.
Adjust the requirements for task responses
In addition to designing tasks with the first three items above, you can also require that students include these elements in their responses. That is, ask students to reference sources that are local, recent or unit-specific (e.g. via Leganto), or paywalled, refereed journal articles available via the MQ Library.
Requiring the inclusion of matching multimedia elements (images, video, designs or diagrams) may reduce the risk that a single AI tool can be used to produce a useful response.
Asking students to include live links in their reference list where applicable (e.g. a DOI URL) will also help markers check sources more quickly.
See it in action – You may also like to check out:
- MQ Examples: Some real-life case studies of how others at MQ are tweaking their assessments for 2023.
- The Top 5: We also evaluated the top 5 assessment types used at MQ for their level of resilience to AI generation and provided some specific mitigation strategies.
Beyond the assessment task itself, there are a number of scaffolding actions that educators can take to support assessment with integrity.
Communicate with students
This is new territory for both educators and students, so don’t be afraid to propose to students that we should explore and learn together. It is suggested that educators:
Discuss academic integrity in the context of AI with students
Use of AI content generators can be discussed in class or via iLearn forums. This can be framed in the context of academic integrity, research ethics and critical thinking in the discipline.
- A set of starter discussion points and a lesson plan are also available to explore the use of AI in learning and assessment.
Co-create rules with students
Consider jointly creating guidelines with students for the use (or otherwise) of AI tools for the assessment tasks in the unit. You could use this lesson plan as a starting point.
Clarify the rules
Specify the conditions under which each assessment task is to be completed. Include whether generative AI tools are permitted in completing tasks and, if so, to what extent AI content generators (and other tools, resources and sources) can be used for each assessment task. The more explicit the guidance the better, so be sure to also clarify what is not permitted.
- A resource designed by the Academic Literacies Unit will help you to communicate levels of acceptable use of AI tools in your assessment tasks.
- We have also prepared advice to help educators work through how to clarify for students any use and acknowledgement of AI tools.
Outline ethical use of AI tools
Use of AI tools in learning and assessment is guided by MQ academic integrity values. It is academic practice to properly cite content or ideas that are not your own; the same is true when using AI generated material. It is recommended that students:
Acknowledge AI sourced content
Ask students to cite where generative AI tools are used to generate ideas, direct quotes, or paraphrases.
- The library has prepared advice on how to cite some common types of AI sources.
Show their working
Consider using approaches that have students surface their thinking and the processes they used to arrive at their response. For example:
- Use a learning diary documenting how and why they arrived at their solution.
- Attach AI generated source materials as an appendix (i.e. their AI ‘conversations’, prompts and AI generated output). Additional guidance is provided on how students and researchers can acknowledge and keep records of their use of different AI tools.
- Alternatively, assurance of learning may be achieved by using interactive oral assessment to check student understanding. Oral assessment can be conducted in person or over a Zoom/Teams session. Zara Bending and colleagues from Macquarie Law School provide advice on how they are using orals, and Griffith University has advice on how to implement and scale the interactive oral assessment method.
Explore effective ways of using AI tools
Like all tools, generative AI tools have differing capabilities and limitations. Exploring these with students will help them become more AI literate. Some activities you can do with students include:
Critique an exemplar
A response produced by a generative AI tool can be used to frame a class critique (either the educator can prepare this in advance or students may submit questions to the AI tool during the class – but note any limitations as to the availability to students of suitable AI tools at that time). This critique may include fact checking (correctness, misconceptions), analysis of the writing style, the structure of the argument, the manner in which ideas or facts are supported (or otherwise, including checking for fake references), or if there are any contradictions within the output. Guide students in exploring the strengths and weaknesses of the AI generated responses and implications this has for the quality of their work, learning, and the development of their critical thinking and evaluative judgement.
- The library has prepared advice on how to develop AI literacy by using the CRAAP critique framework that encourages critical, ethical reflection on AI tool use.
Hunt for references
Perhaps continuing the previous activity, students (in small groups or individually) can be asked to find quality references to support or contest the generated content. Have students search MQ Library Leganto reading lists or the Library's online databases for relevant journals or books. Highlight the need to seek reliable sources to support an argument, position or statement they make within their work. Remind students that they are ultimately responsible for the work they produce and submit.
Write a good prompt
Also termed “prompt engineering”, this is key to effective use of generative AI tools. Have students discuss the characteristics of effective prompts and experiment with ways to produce higher quality output from AI tools relevant to the task. Consider how different tools have different optimal prompt designs. Consider co-creating a set of guidelines or advice on how effective research strategies and prompts can be used for various tasks. A free online course on prompting is available.
Discuss social impacts and potential uses
Discuss the social impacts and potential uses of AI more broadly, including on employability. Teaching staff can lead discussion on how AI content generators are being used in professional and personal life. Students can also be asked to source or suggest examples. Examples from ABC NEWS (24 Jan 2023) and a 2022 McKinsey report include:
- Art and design — Architectural design ideation; creating images and detailed artwork, music, scripts and poetry.
- Marketing — Crafting targeted marketing, social media content such as blog posts and tweets, property listings and event descriptions for ticket sales sites, and technical sales content (including text, images and video); customer assistants (chatbots) trained for specific businesses, such as retail, insurance or banking.
- Operations — Generating task lists for efficient execution of a given activity, or suggesting fixes for production problems.
- IT — Rapidly prototyping software code, reviewing code and writing supporting documentation.
- Legal — Summarising legal information, drafting contracts, and drafting and reviewing annual reports.
- R&D — Summarising scientific literature, accelerating drug discovery through better understanding of diseases, and creating novel protein sequences.
Share your experience
We welcome further suggestions for assessment changes or ways of working with students on the use of AI tools in the comments below. You can also contribute your ideas by emailing firstname.lastname@example.org.
Join the conversation: Thursday, 30th March 2023 from 13:00-14:00. Register free for a MQ Community Roundtable: Gen AI tools – implications for learning, teaching and assessment at Macquarie.
Found an ‘AI generated’ academic integrity breach? See this advice on how to gather evidence and report it to MQ for investigation.
Acknowledgements: Banner image: Stable Diffusion generated artwork “Robot repairing a book” (28 Feb 2023). M. Hillier. CC0 1.0 Universal Public Domain.