When ChatGPT hit the web in the fall of 2022, jaws dropped across the globe. Suddenly, AI chatbots had gone from laughable to impressive, and a little bit scary. People began predicting that AI would kill education as we know it (we promise that it won't). Teachers worried that students would use AI to do their homework and write their essays. And schools realized that it was probably time to add an official AI policy to their handbooks.
What is AI-generated content?
Not entirely sure what AI (artificial intelligence)-generated content is? Let's ask ChatGPT itself! Here's what the program said in response to the question, "What is AI-generated content?":
AI-generated content refers to any form of digital content, such as text, images, videos, or audio, that is created by artificial intelligence systems without direct human intervention. These AI systems use advanced algorithms and machine learning techniques to generate content that mimics human creativity and decision-making processes. …
While AI-generated content has the potential to improve efficiency and creativity in various fields, it also raises ethical concerns related to plagiarism, copyright infringement, bias, and misinformation, which need to be carefully addressed when using and sharing AI-generated materials.
Have more questions? Learn more about AI-generated content at Conductor.
Why do schools need an AI policy?
Schools need to create AI policies for the same reasons they have plagiarism policies: to help students understand what's acceptable and what isn't. After all, we don't tell students they can never use someone else's writing in their own essays. Instead, we explain that they must always acknowledge and properly cite any sources they use. This helps students understand that they can't pass someone else's writing or ideas off as their own, but they can use them to support their own thinking.
An AI policy should do the same thing. AI isn't necessarily the enemy; there are lots of legitimate uses for it. But if students use AI to do all their assignments, they won't learn what they're in school to learn. And when the time comes to demonstrate their knowledge when AI isn't available to them (like an in-class test), they'll likely fail.
So, a school AI policy benefits students as well as teachers. In its current form, AI technology is new to most users, and a good policy helps kids and their families know when and how to use it (and when not to use it).
For more, check out the U.S. Office of Educational Technology's AI page.
Is using AI the same as plagiarism?
Some have argued that plagiarism policies are sufficient to cover AI as well. And while AI use and plagiarism have a lot of overlap, there are some important differences.
Plagiarism is copying another person's work and passing it off as your own. This could be intentional, but it may also be accidental when writers aren't educated on what plagiarism entails. Writers can avoid plagiarism by appropriately citing sources.
AI content is generated by a program, using sophisticated algorithms that pull from a variety of content available on the internet. Depending on the program, the content produced could be plagiarized from another source without attribution. If a writer uses this plagiarized content in their own work, they are also unknowingly plagiarizing.
Of course, the potential for accidental plagiarism isn't the only concern about AI-generated content. But it's important to make students aware that this is one potential issue with using AI programs.
Learn more about plagiarism and AI content from Medium.
How do schools get started with an AI policy?
Plagiarism policies have been around for a long time, but AI policies are fairly new, and you might be unsure how to begin. Here are some steps you might take.
- Gather a team. Put together a team that includes at least one person with a strong understanding of what AI technology is capable of. Others on the team might include administrators, teachers, students, parents, and legal advisors.
- Determine your goals. What do you want your AI policy to include? Will it be separate from other policies, or will you include it in your existing ethics and plagiarism policies? How specific will your policy be?
- Review examples. Take a look at policies written by other schools (see below) and highlight sections you want to include in your own policy. You might even ask ChatGPT or another AI content generator to create some text for you to consider.
- Draft a policy. Put a first draft into writing, and put it through your school's policy review process. Be sure to ask for feedback from teachers, students, and families.
- Make edits, finalize, and publish. Use the feedback you've gathered to make any necessary edits, and ensure your language is clear and specific. Publish it according to your school's guidelines.
- Educate staff and students. Don't rely on your policy alone to help everyone use AI responsibly. Spend time educating both staff and students on the benefits and risks of using AI. See example lessons at Edutopia.
Take a look at 5 smart ways to encourage academic integrity here.
What should an AI policy for schools include?
A comprehensive AI policy requires much more than just telling students "Don't use AI to cheat." Schools should be specific in their guidelines, helping everyone understand what is and isn't appropriate. These are some possible sections to include in your policy:
Appropriate AI Use
There are many ways students can use AI as a tool, rather than a way to cheat. Include examples of Dos and Don'ts in your policy to help make things clear.
DO:
- Use AI programs as smart search engines that present information in ways that are easy to read and understand.
- Ask AI programs for clarification or explanations when you need help.
- Generate ideas, topics, and writing prompts using AI programs.
- Be transparent; attribute AI text and images properly when you use them in your own work.
DON'T:
- Use AI programs to avoid doing your own work.
- Copy text or images from AI programs without proper attribution.
- Use AI text or images without fact-checking and exploring potential plagiarism issues.
- Use AI when your teacher expressly forbids it.
Get more possible ideas and learn how AI helps students learn in this article from The Advocate.
Responsible AI Use
This section should lay out the potential risks of using AI and what responsible use looks like. It should include safety cautions about sharing personal data with AI bots, as well as about using them to invade others' privacy.
Your policy should also remind students that AI programs can reflect implicit bias and even present incorrect information. Anytime they use an AI program, they should think critically and be sure to fact-check using primary sources.
Discover more about responsible AI use in the classroom from UNC Charlotte.
Reporting and Consequences
Use this section to encourage students to report any AI misuse they're aware of. Also, lay out the potential consequences if staff discover a student misusing AI. Will you align them with your plagiarism policies? Consider them ethics violations? Each school must decide its academic integrity policies for itself, and AI violations should be part of them.
Explore a variety of approaches to addressing ChatGPT and academic integrity at AVID Open Access.
Education and Awareness
Schools should commit to educating students and staff about advances in AI technology and its responsible use. Consider requiring students to participate in AI-use education at the beginning of each school year.
Your policy should also clearly state any ways in which the school itself uses AI programs, from data collection and analysis to automatically generated notifications, etc. Note the schoolâs commitment to using AI fairly and safely.
AI Policy Resources for Schools
There's a lot to consider as you formulate your policy. Try these resources to help.
- Alice Keeler: Acceptable Use Policy for AI in the ELA Classroom
- Common Sense Education: How To Handle AI in Schools
- UNESCO: AI and Education Guidance for Policy Makers
- Cleveland State University AI Policy Statement Examples
- University of Missouri: ChatGPT, Artificial Intelligence, and Academic Integrity
- University of British Columbia: Chat GPT and Other Generative AI Tools