Navigating the Intersection of AI and Academic Integrity: Professor Jing Lei Weighs In

The rapid rise of artificial intelligence presents a number of challenges for educators. A recent report from the Center for Democracy and Technology (CDT) looks at how teachers are balancing AI’s potential with concerns over academic integrity in K-12 schools.

Jing Lei is a professor at Syracuse University’s School of Education who focuses on technology integration in schools. She answers some questions about how teachers and school districts can address these concerns productively and harness AI as a tool for learning.

Q: According to the CDT survey, 52% of teachers say generative AI has made them distrustful that students have actually done the work themselves. And 68% report regularly using an AI content detection tool. How reliable are those tools?

A: AI detection tools give a “probability” that a piece of work was produced by AI, which, even at 99%, does not mean it was certainly created by AI. The reliability of AI detection tools varies greatly, based on many factors, including what algorithms and technology are used and how the tool is trained.

They also suffer from biases, stereotypes, and a lack of contextual understanding. AI detectors should not be used as the sole basis for deciding whether a piece of work was generated by AI.

A teacher’s best tool in detecting any potential violation of academic integrity is their understanding of their own students. A teacher who knows their students well enough would not need an AI detection tool to determine if something is not right. That understanding and the human connection can never be replaced by any technology.

Q: How does this level of distrust and uncertainty impact students?

A: This level of distrust and uncertainty is harmful to everyone involved: it lowers teachers’ confidence in and passion for teaching, dampens students’ enthusiasm and decreases their motivation to learn, and harms the teacher-student relationship, one of the most crucial elements in a healthy and productive learning environment.

In particular, a “false positive” accusation of plagiarism can be devastating to students and can have long-lasting detrimental effects on their educational aspirations.

Q: Only 28% of teachers say they’ve received guidance on how to respond to suspected AI use. What should district leaders consider as they set policies?

A: School districts need to help their teachers develop a sound and evolving understanding of AI technology: what AI is, what AI tools and resources are available, what those tools can do, what limitations they have at this stage, what risks they pose, why it is important for their students as well as for themselves to learn to use AI tools meaningfully, and how to do so.

To develop this understanding, districts need to provide learning opportunities, resources, and ongoing support to help teachers learn, explore, and experiment with AI tools in their classrooms. 

District-level policymaking should involve important stakeholders (teachers, students, administrators, and parents and guardians), consider multiple perspectives, set equitable and inclusive guidelines, and leave room for flexibility based on contextual factors and the evolving nature of AI technology.

“A teacher’s best tool in detecting any potential violation of academic integrity is their understanding of their own students.”

Q: How should teachers approach responsible AI use in the classroom? Are there lessons that will help students learn to productively use the technology while also avoiding some of these academic integrity concerns?

A: There are many ways that teachers can use AI technology with their students productively. For example, they can explore together what AI technology can and cannot do, discuss its limitations and the importance of human oversight, work together to clearly define what constitutes plagiarism with AI and what counts as acceptable use, help students develop critical thinking by critically analyzing AI-generated content to identify errors and biases, encourage students to use AI for personalized learning, and examine how AI is being used in various work contexts.

Through activities like these, teachers can help their students learn to use AI technology productively and with academic integrity, and to develop their digital citizenship.

Originally published by SU News.