Understanding the New Landscape

Generative Artificial Intelligence

The advent and rapid uptake of generative artificial intelligence (GenAI) tools with increasingly multimodal capabilities has radically shifted the educational landscape, presenting both opportunities and challenges for fostering student learning and maintaining academic integrity. Australian schools find themselves at an inflection point, seeking to harness the potential of GenAI to enhance learning whilst ensuring authentic learning outcomes. This section provides an overview of key considerations for school governance, education, detection of unapproved use of GenAI, and learning design strategies for fostering academic integrity and strengthening assessment validity.

Governance

Each school is encouraged to consult with relevant curriculum authorities and to review its assessment and learning and teaching policies to support students in becoming critical and ethical users of GenAI and other technologies. The Glennie School’s Academic Integrity and Assessment Policy is provided as an example of one approach to supporting student success in learning and assessment.
This policy complies with the Queensland Curriculum and Assessment Authority’s assessment policies and guidance on AI. The Australian Government has also developed the Australian Framework for Generative Artificial Intelligence in Schools, which offers valuable guidance. The Glennie School has taken a proactive, student-centred approach that focuses on developing students’ skills as learners so they can make informed, critical, and ethical choices, while also making clear to students the impact of engaging in academic misconduct. Explicit statements about the appropriate use of learning tools, including AI, are included in each summative assessment item.
ChatGPT is a powerful language model developed by OpenAI that can understand and generate human-like text. But it is not the only AI tool students may be using: many tools, such as Grammarly or Copilot, go beyond checking simple spelling and grammar and will now make suggestions or generate entirely new content. The technology is rapidly evolving, with newer models able to produce multimodal outputs such as video, code or even novel artefacts like websites. Examples of such tools include Sora, Gemini (formerly Bard) and Adobe Firefly.
Generative artificial intelligence has the potential to revolutionise our teaching and learning and is already making a significant impact in education. This section outlines a management approach for secondary schools relating to GenAI and Academic Integrity:
  • Governance
  • Education
  • Detection
  • Prevention
This section also provides considerations for assessment redesign, alongside opportunities to enhance student learning through the development of their general capabilities and a considered integration of such technologies. You should consider these in light of your school’s pedagogies, policies and procedures.

Education

Students in Years 7-9 start the year with a presentation from the Director of Learning and Innovation and teaching staff that addresses what it is to be a learner at Glennie, including the practice of academic integrity. This information may be useful as an example which can be adapted to your own context.
All students in Years 10-12 are required to complete the QCAA’s Academic Integrity course annually. This is reinforced by classroom teachers as students complete both classwork and summative assessments. Teachers are also required to complete the Teacher Academic Integrity course and refresh their training as directed.
Glennie has taken the General Capabilities approach as the basis for our Learner Growth Framework. We work in partnership with parents and carers to ensure there is a shared, consistent, intercultural understanding about developing successful learners using appropriate supports (e.g. tutoring, homework, subscription services, etc.).
  • In Year 7, the focus is on developing students’ personal and social capabilities and understanding the literacy and numeracy expectations of secondary schooling.
  • In Years 8 and 9, we build upon these foundations with a strong focus on ethical understanding about working with others, and on using learning tools appropriately as students develop their digital literacies.
  • Finally, in Years 10-12, having established our students as confident and capable learners, our focus shifts to ensuring they have the critical and creative thinking skills necessary for success in the senior years of their secondary schooling.

Detection

The unethical use of GenAI presents a new challenge in ensuring the academic integrity of students’ work. A range of detection tools can help teachers ascertain whether a student is likely to have inappropriately used GenAI in creating their response, and thus engaged in academic misconduct. These detection tools are just one part of a broader detection strategy designed to support students in the appropriate use of GenAI, and to apply consequences for academic misconduct.
Although it is difficult to detect the use of GenAI technologies such as ChatGPT, there are some important signs to look for when marking student assessments. These are outlined in detail in Managing Academic Misconduct.
These include:
  • Teacher knowledge of student’s past performance and ability
  • Difference in quality to formative assessment or class work
  • Difference in quality and formatting to previous submissions
  • Distinctive and formulaic syntax, grammar and voice
  • Contextual inappropriateness to topic and discipline
  • False or outdated information and references
  • Lack of version history; evidence of large copy and pasted passages
  • Students unable to explain or replicate the creation of multi-media outputs such as audio, images, video, slideshows or websites
A wide range of detection tools is available, with varying degrees of accuracy. Paid services such as Turnitin are one means of detecting the possible use of AI in written text. Several open-source artificial intelligence detectors have also been created in response to generative technologies such as ChatGPT. However, there are concerns around legal and ethical obligations regarding student data and privacy, as well as the accuracy of these tools’ reports. We do not encourage the use of open-source detection tools at this stage.

Prevention through learning design and strengthening assessment validity

Ultimately, the most pragmatic and cost-effective way to mitigate the risks of academic misconduct is to design a programmatic approach to assessment. This involves mapping critical tasks through the learning program and resourcing the design and delivery of those specific assessment tasks appropriately, to ensure learning outcomes and objectives are met. A student’s assessment can be considered valid when it represents their actual capabilities. The critical question for teachers then becomes: what do we want our students to learn in the age of GenAI, and how can we confirm that they have learnt it?

As GenAI can now create outcomes that resemble traditional assessment artefacts such as essays, blogs and videos, the focus of assessment needs to shift from the final product to the process of production. This means that tasks need regular checkpoints, and teachers need to observe how students’ ideas and responses develop iteratively over time. Students need to be able to explain and evidence how their assessment response has been authentically developed. The QCAA provides a helpful list of strategies for ensuring authenticity.

This broader topic is discussed in detail in the Teaching for Academic Integrity section; however, below are some examples of assessment types that can mitigate the risks of academic misconduct:

  • Continuing assessment (i.e. smaller linked assessments that build upon each other)
  • Authentic assessment tasks, which reflect real-world tasks relevant to specific disciplines and industries. Some examples include:
    • Problem-based or hands-on learning and assessment (i.e. performance tasks)
    • Projects and portfolios (i.e. collection of work that showcases achievements over time)
    • Personalised case studies or simulations (i.e. hyper-realistic and contextual tasks)
    • Group tasks (i.e. requiring collaboration, interpersonal communication and critical thinking skills)
    • Dialogic assessments (i.e. requiring discussion between the teacher and the student)
In circumstances where students’ knowledge and/or ability to respond independently is the main learning outcome being assessed, closed assessments, such as invigilated in-class quizzes and exams or the use of specialised assessment software, may be the most appropriate way to ensure that academic integrity is maintained.

Levels of AI use: A UniSQ Example

Different assessments may warrant different approaches to the levels of AI use permitted. Depending on the learning outcomes being assessed, this can range from no AI use to full AI integration. Rubric redesign may need to be considered in order to evaluate the knowledge and skills demonstrated in the process of production, rather than the final product.
