We Check Your Homework for AI Usage
We use an automated system to check certain types of assignments for plagiarism and AI-generated content. The same system is used by some of the top universities in the world and is very accurate at catching AI-written work, but it is not perfect. If it flags one of your assignments, it means one of two things:
1. You used AI in some way to complete the assignment. This is not acceptable. If you are letting an AI think for you, you are not learning to think. If you let an AI write for you, you are not learning to write. Instead of gaining valuable skills that can help you in your future, you are wasting the opportunities we are giving you to actually learn something. There are good ways to use AI to help you learn, but this is not one of them.
2. You didn’t use AI at all, but your writing is not distinct from what an AI could produce. One of the key writing skills every student needs to develop is their own voice (style), which is unique to every person. Now that generative AI exists, students have an additional challenge: you need to write in a way that shows you are human. If you can’t create work that stands out from what an AI can make, why would anyone choose to hire you for anything that an AI can do just as effectively? Your goal is to be better than an AI. Use this as an opportunity to practice.
In the future, try running your writing through an AI checker before turning it in, and rewrite it until it no longer looks like an AI wrote it.
Potts' AI Policy for Students
There are ways to use AI responsibly to help your learning, but using it to answer homework questions or write assignments for you is NOT a responsible way to use it. The purpose of education is to give students the knowledge and skills to be successful in their adult lives, and to become responsible, ethical people who benefit society instead of harming it. Teachers want you to learn to write so you can express your thoughts in a way that others can understand. Teachers want you to learn to think critically so you don’t get tricked by misinformation. If you use AI to undermine your own learning, you will not develop the essential skills you need in order to be truly successful. Don’t waste the learning opportunities you are being given.
Adapted from:
Uche, Nnenna, et al. “Guidelines for the Ethical Use of Generative AI (i.e. CHATGPT) on Campus.” Markkula Center for Applied Ethics, Santa Clara University, www.scu.edu/ethics/focus-areas/campus-ethics/guidelines-for-the-ethical-use-of-generative-ai-ie-chatgpt-on-campus/.
- NEVER directly copy any words used by ChatGPT or any other generative AI.
- Always be careful of the huge biases (one-sided ideas) that generative AI might have.
- Do not trust AI to give you correct information; only use reliable sources when researching important topics (AI is NOT a reliable source).
- Whenever using AI, ALWAYS double-check ALL information against other trustworthy sources to make sure it's correct.
- Use AI as an extra learning tool, not a way to avoid honestly completing schoolwork.
- Before using AI, think of your own skills and the value of solving problems yourself. Are you using it for a learning purpose or just to be lazy?
- Before you use AI, ask yourself if your teacher would approve of how you are using it, and if it follows the academic honesty policies that every learning institution has.
AI Plagiarism
Plagiarism is not a new idea, and the ways to avoid it are the same today as they were before generative AI existed: use quotation marks to show specific parts that you did not write yourself, paraphrase ideas into your own words, and ALWAYS cite your sources.
If you copy a paragraph from Wikipedia and turn it in as your own, that’s plagiarism, and it’s cheating. If you read a Wikipedia article, then write your own composition that does not use any specific words or ideas from the article but shows a deep understanding of the topic, and then cite that Wikipedia article as the source of your information, that’s not plagiarism. It’s not cheating: it’s good writing.
If you copy what an AI wrote and turn it in as your homework, that is cheating. If you can take something that was written by AI, cite it, and rewrite it so thoroughly that no one could tell that you used an AI, that shows that you have good writing skills. If you can take ideas from AI, cite them, and talk about them so clearly that it shows you actually understand them, that shows good thinking skills.
The #1 Problem With AI
The single biggest problem with AI is that it is often incorrect. If it were always wrong, that would actually be better, because students would know they couldn’t trust it. However, it gives correct information alongside incorrect information, and it does not know which parts are right and which are wrong, and neither do you. There is a name for situations when an AI makes things up completely: they are called “hallucinations.”
Here is a real example:
Jien gave ChatGPT a poem called The Emperor of Ice-Cream by Wallace Stevens (you can see the whole poem here) and asked it to identify the symbols in it. This is exactly what it said:
- Emperor of Ice-Cream: Symbolizes the celebration of life and the acceptance of pleasure in the face of death.
- Cold Heavenly Haunts: Represents the cold reality of death and the afterlife.
- Bachelors: Symbolize the detachment and impartiality of nature in the face of human mortality.
A screenshot is below:
The first one is correct. The “Emperor of Ice-Cream” in the poem does symbolize the idea that we should find joy in life because death will happen to everyone.
However, look in the original poem for the words “Cold Heavenly Haunts” or “Bachelors.” These are not in the poem anywhere.
ChatGPT gave one correct answer along with two incorrect answers. If you did not read the poem carefully or look closely at ChatGPT’s answer, you would think the answer it gave looks correct. This is why AI is dangerous for anyone who is not an expert in the subject they are asking about. If a doctor asks an AI a medical question, the doctor will know exactly which parts of the answer are correct and which are not. If you ask an AI a medical question, you don’t know which parts of the answer are correct and which are not.
Generative AI is called “generative” because it is very good at generating (creating) things that look like human writing. That is what it does very well. What it does NOT do is evaluate the sources of the information it is writing. It does not “understand” anything. It does not “know” what a poem or story is actually about. All it does is put words next to each other that seem to go together. Sometimes those words happen to be correct, and sometimes they are hallucinations.
The only thing that makes you better than AI is that you can evaluate sources. You can understand things. You can know what a poem or story is actually about, and you can check if information is real or just made up.
But you can only do these things if you learn to think critically and use AI responsibly. Your teachers at Potts are here to help you with this, but we can’t help you if you let AI do your thinking for you.