When sophomore Aubrey Blush received her grade for an essay, she wasn’t happy.
It wasn’t the content that bothered her, or even her own performance. Instead, it was how the essay was graded—or rather, who graded it.
Her teacher used an artificial intelligence (AI) tool called Brisk to assist with offering feedback and assigning a grade. However, Blush felt that her teacher had taken it too far.
“This really made me feel a little disappointed because I felt as if my education was not being taken seriously,” Blush said.
The question of where AI fits into the educational narrative has become more and more pressing since ChatGPT and similar tools surfaced in 2022. Originally, the concern was how students would abuse the intuitive technology. Now, however, many school districts across the country, Brighton Area Schools included, have placed more focus on AI use by teachers.
While some students like Blush are concerned about how AI might degrade the traditional role of educators, many administrators and teachers are focusing on the benefits. As many people have overcome their initial hesitancy about the technology, they are more open to exploring the variety of tools and possibilities the world of AI has to offer.
“The district seems to be very encouraging of using AI in positive, beneficial ways, and less concerned with the punitive aspects of students doing bad things,” said Kimberly Christiansen, who teaches AP English Language and Composition, Yearbook and Myth and Sci-Fi at Brighton High School. “So they’ve been encouraging us to use it to help with instruction or to use it to help with giving feedback and grading and those kinds of things.”
One of the ways that Christiansen noted the district has encouraged this is through providing resources and speakers to offer guidance and ideas. Eric Curts, a technology integration specialist in Northeast Ohio and the founder of educational blog Control Alt Achieve, leads workshops across the country to discuss the ethical use of AI and some of the resources available to educators. He has already visited BHS twice and is returning in February 2025.
Curts said that he has “been a proponent of AI in education probably for the last ten years,” but he also recognizes the concerns around it.
“It’s a tool, just like a hammer. You can build with a hammer; you can destroy with a hammer,” he said. “Same with AI. It could help, it could assist, it could support, or we could use it in ways that are not appropriate. So it’s important that we don’t just talk about all of the positives, but we address the potential negatives.”
Curts recommends that all students and teachers familiarize themselves with one of the “core language models”—ChatGPT, Gemini, Claude and other large AI models that have the ability to address an expansive variety of topics and questions. However, he is also an advocate for what he calls “prompt tools.” These tools allow educators to choose from a list of pre-generated prompts, then customize the type of material they want produced. The tools then use larger engines like ChatGPT to generate the result, making the process easier for teachers to navigate.
“Those are great tools, especially for teachers that are still maybe not as comfortable,” Curts said.
Christiansen has had experience with tools like this in the past, particularly Brisk AI, which she said she uses on a regular basis to help create rubrics, tailor quiz questions and offer feedback on students’ work. However, she also noted that these experiences have made her more aware of AI’s drawbacks and the fact that it is not always right.
Given the sometimes flawed nature of AI and the room for potential abuse, the question of what constitutes cheating—especially among teachers—is murky.
Curts offers a loose definition: “It’s what’s the objective.”
“[The teacher’s] job is to help students learn material, support them [and] help them to achieve and learn,” Curts said. “So if I’m a teacher and I use AI to make a better set of quiz questions, well, that’s good because that’s helping my students. If I am a teacher and I use AI to brainstorm better ideas for a lesson, that’s good because that’s supporting my goal.”
Where Curts draws the line is when teachers lose sight of the goal of being the principal supporter of their students. For example, he defines having AI grade students’ essays as going too far.
“What I just would say is, yeah, let the AI help, but you still have your piece of the pie,” Curts said. “You’re still ultimately responsible.”
As a teacher, Christiansen holds a similar perspective.
“I personally would not use it to grade an assignment,” she said. “I would use it to give feedback, and then we’ll go with the feedback and just make sure, you know, tweak it if I needed to, but I would not use it to just grade and not actually grade it myself.”
Blush also said that when teachers use AI beyond a complementary role, it can damage the perception that students’ education is valued.
“AI can be a great tool when used properly, but when it isn’t, it can leave students feeling like their teacher doesn’t care enough about them to actually take the time to grade them,” she said.
Despite the complications and controversy surrounding AI use by teachers, Christiansen pointed out that society is in a transitional phase when it comes to adapting to the new technology.
“I think we’re at, I guess, a turning point or a major shift in just how we do things as a society, and I think that we’re scrambling in education to figure out how to handle that, to figure out what the appropriate use of AI is and what is not so appropriate,” she said.
This article is one component in a series about efforts to integrate AI usage into education at BHS and the controversy surrounding it. Future stories will dive deeper into what constitutes ethical AI usage among students and further measures the school is taking to adapt to changes in AI technology.