Legal implications of ChatGPT in Australian Schools

Since launching in November 2022, the popular artificial intelligence tool Chat Generative Pre-Trained Transformer (“ChatGPT”) has facilitated a technological revolution with far-reaching consequences. For Australian schools, this brings to the fore an abundance of legal issues for consideration.

This article provides an introduction to the legal challenges arising from ChatGPT and offers several measures to manage and stay abreast of this ever-changing technological landscape.


How does ChatGPT work?

ChatGPT is a language processing tool, accessible to the public through web browsers. It has the capability to compose emails, essays, code, scripts and a variety of other forms of advanced text.


How have schools responded?

As a new technology, educational institutions are in the initial phases of managing and responding to ChatGPT. The Departments of Education in Western Australia, New South Wales, Queensland and Tasmania have responded swiftly, banning the use of ChatGPT in primary and secondary public schools.

At a tertiary level, Australian universities have had to rapidly restructure their respective curriculums; such measures include increased face-to-face supervision and pen-and-paper examinations, which afford students fewer opportunities to plagiarise work with ChatGPT.


What legal issues arise for schools? 

Intellectual Property and Ownership Issues

Just as computers and other educational technology have become a regular part of teaching practice, why not utilise ChatGPT to boost productivity and thereby afford more time to focus on what matters most – the children?

One potential hindrance in this regard is legal ownership of the content created. Is the teacher and/or educational institution the proprietor, or is it ChatGPT?

Recently, the Full Court of the Federal Court of Australia in Commissioner of Patents v Thaler [2022]1 found that AI could not be credited as the ‘inventor’ of a patent under the Patents Act,2 as it is not a natural person, and is therefore incapable of owning intellectual property.

Whilst the Thaler case provides a useful illustration of this constantly evolving technological space, it leaves many questions unanswered.

As a starting point, sub-clause 3(a) of the Terms of Use of OpenAI (the creator of ChatGPT) provides:

As between the parties and to the extent permitted by applicable law, you own all Input. Subject to your compliance with these Terms, OpenAI hereby assigns to you all its right, title and interest in and to Output. This means you can use Content for any purpose, including commercial purposes such as sale or publication, if you comply with these Terms. 

However, there is a caveat to this proposition.

While OpenAI does not use data submitted by customers through its API3 to develop or improve its Services, the same cannot be said for ChatGPT. When users engage with ChatGPT, OpenAI stipulates at sub-clause 3(c) of its Terms of Use that it may utilise the content “to help develop and improve our Services”.

What does this mean for educators? By way of example, Teacher A uses ChatGPT to create a lesson plan based on a pre-existing set of learning objectives. Ownership of the learning objectives (the input) remains with the teacher, and ownership of the lesson plan generated (the output) is assigned to the teacher as the user.

However, under sub-clause 3(c), the data from both the input (learning objectives) and the output (lesson plan) may be used to improve the ChatGPT model. So, if an unrelated Teacher B subsequently requests either a lesson plan or learning objectives, the product ChatGPT generates to satisfy that request may draw on the information entered and generated by Teacher A.



The provenance of the data used to train ChatGPT is also problematic. The data was acquired from a wide range of sources, including personal information obtained without consent and copyrighted texts.

Clause 9(d) of the Terms of Use provides a mechanism for resolving alleged copyright infringement, which may result in the “[deletion] or [disabling of] content alleged to be infringing”. In the interim, however, what are the flow-on impacts as they relate to the legality of accessing this information within an educational context?

The Privacy Act 1988 (Cth)

Educational institutions must abide by the Australian Privacy Principles set out in the Privacy Act 1988 (Cth), which establish the minimum standards for the collection, use, access, and disclosure of personal information (for further information regarding the Privacy Principles, see our Article Disclosing Private Information Regarding Schools).

The intersection of the Privacy Principles and AI is another area for close consideration.

Primarily, the Privacy Policy of OpenAI provides for the collection of the following personal information:

a. account information and user content; 

b. the contents of any messages [users] send (referred to as Communication Information); 

c. Internet Protocol address, browser type and settings, and how users interact with the website; and 

d. the types of content that users view or engage with, the features users utilise and the actions users take, as well as time zone, country, the dates and times of access, user agent and version, type of computer or mobile device, computer connection, IP address and ‘the like’. 

Let us consider, then, a hypothetical situation where a teacher asks ChatGPT to provide suggestions regarding a student’s individual learning plan. The teacher includes specific information about the student, including their learning difficulties and sensitive personal background. 

In circumstances where ChatGPT can extract the contents of any message, this would constitute a breach of Privacy Principle 11.1(b), which requires schools to take such steps as are reasonable in the circumstances to protect personal information from unauthorised access, modification or disclosure. 


How can schools manage and mitigate legal risks arising from use of ChatGPT?

    • For schools that are utilising ChatGPT to create educational content, it is important they are aware that content created by ChatGPT may not be protected by copyright. While the status of copyright protection for AI-generated content is a developing area of Australian law, schools should implement appropriate policies to provide clarity for their educators. 
    • If permitting teachers to utilise ChatGPT, teachers should ensure material is cross-checked for accuracy prior to use. Furthermore, when using the service to draft policies, procedures and marketing material, the information should be appropriately reviewed and verified. 
    • To maintain academic outcomes and integrity in the face of these challenges and in consideration of the changing technological landscape, educational institutions may wish to consider which detection tools are most applicable and necessary for detecting use of ChatGPT and similar programmes. 
    • In creating new policies and approaches to ChatGPT and similar technologies, schools should consider strategies to integrate these technological tools in the classroom. Since they will remain a fixture within the technological landscape, empowering students to effectively and ethically use these new tools is a productive approach.  
    • Schools should maintain open communication about steps being taken to manage these challenges. While this will be an ongoing process, schools may need to seek external resources and support in educating themselves on policies and procedures that are up to date, comprehensive and legally sound.  
    • Since ChatGPT does not explicitly copy text, this raises the question of whether use by students constitutes plagiarism. On its face, use of ChatGPT may not be plagiarism; however, it may contravene academic policies. While not a legal obstacle for schools, it is a matter that educational institutions should address in the drafting of relevant policies.  


What are the benefits of having a ChatGPT Policy?  

While the current bans provide an interim solution for schools to gather information and assess their next steps, this technology is here to stay. We recommend the implementation of an agile policy that addresses the issues raised above and facilitates responsible use.  

By providing a ChatGPT policy for students, schools are taking proactive steps to respond to this new technology and to safeguard academic integrity. While the policy itself may be subject to ongoing updates and reviews, the creation and implementation of a policy will clarify the school’s expectations and response to the challenges raised. Furthermore, it will provide clarity to students and staff regarding their obligations, and serve as a starting point for schools in handling issues regarding the use of ChatGPT. 

If you are an educational institution seeking assistance in creating policies in response to ChatGPT and Artificial Intelligence tools or looking to further understand your obligations, the friendly team at Corney & Lind Lawyers can help. Contact our team today on (07) 3252 0011 or email us at:

Corney & Lind Lawyers provides articles on its website for general and informative purposes only. Any articles on our website are not intended as, nor should they be taken as, constituting professional legal advice. If you have a problem that requires a legal opinion, Corney & Lind lawyers always recommends that you seek independent legal advice that is appropriately tailored to your circumstances from an appropriately qualified legal representative.

This article was written by Courtney Linton & James Tan.



1 FCAFC 62.

2 1990 (Cth).

3 Application programming interface: a software intermediary that allows two applications to talk to each other.