This FAQ page provides information about generative AI in higher education, including its advantages, drawbacks, and privacy and ethical issues. It is written primarily to help educators and administrators gain a deeper understanding of this rapidly evolving space.
The Academic Integrity Policy was amended on 25 February 2023 and provides the university with immediate coverage for the use of AI tools such as ChatGPT. These technologies are now included in what constitutes academic misconduct.
- Clause 6(2)(g)(ii) states that “using work (e.g. assignment, essay, exam paper, research paper, creative project, data) generated by an artificial intelligence (AI) tool in an assessment unless expressly permitted to do so and with proper acknowledgement” is considered plagiarism.
While this means that students are not allowed to use generative AI in an assessment unless permitted to do so, at UOW we aim to equip students with the tools needed for the ethical and responsible use of AI. This aligns with our Strategic Goal 1 – Empowering Students for their Future. Therefore, academics are encouraged to discuss the use of these technologies with their students and allow their appropriate use in assessments. How this can be done will vary depending on several factors (such as the discipline, the assessment type, etc.). In all cases, it is important for academics to define what constitutes appropriate use of generative AI and for students to properly acknowledge their use of AI in assessments.
AI tools are quickly developing, and a further review of the Academic Integrity Policy and Academic Misconduct Procedures is in progress – UOW staff will be invited to comment on any changes.
At UOW, we strongly recommend that this not be done. There are ethical concerns and significant risks around privacy and the provision of student data (without consent) to a third party that need to be evaluated from a legal perspective. The reliability of AI text detectors remains doubtful: the accuracy of their results is questionable, their usefulness as evidence in academic misconduct cases is not yet clear, and it is not yet known whether any AI text detector will achieve a level of precision suitable for educational settings.
No. Turnitin launched a new detection feature in April 2023 that could purportedly be used to assist in detecting AI-generated text in student submissions. The lack of time for adequate testing and the minimal notice given by Turnitin were among the reasons why UOW, along with multiple other institutions, chose to opt out of the initial launch of this first version. Without such testing, we cannot assess the efficacy of this tool or the accuracy of its results. This creates a risk of erroneous results, potentially leading to students being falsely accused of academic misconduct. We understand that some academics may be able to access this feature through an individual web-based account; however, the University recommends that it not be used at this time, for the reasons listed above as well as the ethical and legal issues around sharing student work with third parties without consent.
At UOW, our focus is not on surveillance. We understand that, given the speed at which generative AI is evolving, detection tools are unlikely to be 100% effective. Academic integrity is a core value at UOW, and academics are expected to have open and honest conversations with students about the importance of acting with honesty, fairness, trust and responsibility. These values are expected of the UOW academic community and are also reflected in the ethical codes of conduct of most, if not all, professions and organisations students will engage with after graduating. To better prepare students for their future, the focus at UOW is on the opportunities generative AI presents and how we can be more creative and more authentic in designing assessment tasks. Professor Theo Farrell, Deputy Vice-Chancellor (Academic & Student Life), explains: “AI will increasingly become endemic in our work and everyday lives. We shouldn’t be scared of it. Instead, we must try to understand and embrace it while constantly re-evaluating the moral and pragmatic implications of this revolutionary technology.”
One of the current concerns is that AI tools may be able to provide passing-level answers to some of the assessment tasks we assign our students. With detection tools unlikely to be 100% effective, the question is how we can ensure that students are achieving the learning outcomes if they may be using an artificial intelligence tool to generate their assessments. The good news is that generative AI tools also have the potential to revolutionise the way assessments are designed and administered. We now have an opportunity to make the most of emerging technologies to assist us in designing assessment tasks for job-ready graduates.
We need to make sure that the assessments we design and implement are quality assessment tasks (see UOW Assessment & Feedback Principles). This has not changed with the emergence of Artificial Intelligence. Assessments still need to be aligned with your learning outcomes and it is important that we avoid setting more difficult assessments due to a worry that students may be using generative AI in their work.
In this video, Associate Professor Sarah K. Howard reflects on AI and education and what it means for the ways we as educators should be thinking about and designing for learning, teaching, and assessment.
Considering the pace at which AI is evolving, it is critical to recognise that it will be very difficult to design assessments that are completely AI-proof. Although we cannot “design out” cheating entirely, designing quality assessment goes hand-in-hand with developing our students’ digital literacy capabilities and facilitating their academic integrity education (refer to The role of quality assessment design in strengthening academic integrity).
Hear from UOW educators Matalena Tofa, Armin Alimardani and Jonathon Mackay as they share how they have incorporated GenAI into their assessments. Ella Young offers a student's perspective on Jonathon's assessment incorporating GenAI.
The most important thing is to have an open and honest conversation with your students about ChatGPT and other generative AI tools. It is quite likely that your students will have already heard about them, and some may even have tried them. There is a lot of misinformation around them, though, and we play a key role in helping our students build their digital literacy and use artificial intelligence in an appropriate and ethical manner.
We would suggest that, after you have explored how generative AI tools work and experimented with them (see AI in Education: Publicly available GenAI tools), you talk about them with your students. Some main points to discuss would be that:
- Responses from generative AI can mirror existing biases present in the training data or in the information they access.
- They are trained to sound like human writing, but they are simply using algorithms to process and generate content: they perform statistical analysis of patterns in text and predict likely next words to produce what sound like plausible sentences (see the sketch after this list).
- They have been trained on so much data that they are strong at providing clear and simple explanations of difficult concepts.
- They may make things up. These tools are trained to mimic human writing, but they do not understand or actually know anything. They may give you a wrong answer to a question, yet sound very convincing. It is important to evaluate their responses critically.
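To make the "prediction on words" point concrete, the sketch below is a deliberately tiny illustration (not part of any UOW system or of ChatGPT itself): it builds a toy next-word predictor from word-pair counts in a few invented example sentences. Real generative AI models are vastly larger and more sophisticated, but the underlying idea is the same: the next word is chosen from learned statistical patterns, not from understanding.

```python
# A minimal, illustrative sketch: a toy "next word" predictor built from
# bigram (word-pair) counts. The example text and names are invented for
# illustration only.
import random
from collections import Counter, defaultdict

training_text = (
    "generative ai predicts the next word . "
    "generative ai predicts plausible text . "
    "the next word is chosen from learned patterns ."
)

# Count which word tends to follow which word in the training text.
follower_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follower_counts[current_word][next_word] += 1

def generate(start_word, length=8):
    """Generate text by repeatedly sampling a statistically likely next word."""
    output = [start_word]
    for _ in range(length):
        followers = follower_counts.get(output[-1])
        if not followers:
            break
        choices, weights = zip(*followers.items())
        output.append(random.choices(choices, weights=weights)[0])
    return " ".join(output)

print(generate("generative"))
# One possible output: "generative ai predicts the next word is chosen from"
# It reads plausibly because it follows observed word patterns, even though
# the program has no understanding of what the words mean.
```

Running the sketch a few times produces different, plausible-sounding word strings, which can be a useful in-class illustration of why fluent AI output is not the same as understanding.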
You could demonstrate ChatGPT (or another generative AI tool) live in class. This could be done as a teaching activity and a way to begin the discussion of the points above.
The UOW Library provides a resource for students, Using Generative AI and ChatGPT well, which takes a scaffolded approach to supporting students in their explorations with generative AI. Starting with understanding the basics, the resource guides students through ethical use cases, prompt generation, how to reference AI appropriately, and more.
Students also need to be aware of the Academic Integrity Policy, the UOW AI & ChatGPT page and information about referencing. While formal guidelines on referencing the use of AI tools are still in development, in the interim students are encouraged to refer to the Software and Apps section of their referencing guide and to check with their lecturer to confirm the lecturer's position on using AI tools.
UOW’s Learning and Teaching Hub (L&T Hub) is regularly updated to house the latest university-supported information related to the Impact of Artificial Intelligence (AI) on learning and teaching. The Hub provides up-to-date information about key resources, events, and support on the topic.
Hear from UOW students and educators sharing their perspectives and practices incorporating GenAI in the L&T Hub Showcase.
Also available is a Key Resources page with links to news articles, scholarly papers, ideas and considerations for incorporating GenAI into teaching.
You can request a conversation with a Learning and Teaching specialist to discuss assessment design and implementation and/or teaching practices. This can be done:
By phone: (02) 4221 4004 | 8:30 AM to 5:00 PM Monday to Friday (AEST)
By email: ltc-central@uow.edu.au
Please let us know and we can add it here.
By phone: (02) 4221 4004 | 8:30 AM to 5:00 PM Monday to Friday (AEST)
By email: ltc-central@uow.edu.au