
AI in the Classroom

Graphic by Sarah Ogden.

In an anonymous survey of 119 students conducted by The Sandspur last semester, over 71 percent admitted to incorporating AI tools into their academic work. A 2024 survey by the Digital Education Council, which polled 3,800 students from 16 countries, found that 86 percent of students regularly use AI in their studies.

As higher education adapts to advancements in AI and more professors incorporate AI tools into their curricula, faculty at Rollins are experimenting with how these tools can be used responsibly in the classroom. To understand how AI is impacting different departments at Rollins, we interviewed Professor Matthew Forsythe of the English department and Professor Jasser Jasser of the Data Analytics department.

“With the creative fields, a question has always been, ‘Why do we write these stories?’ So much of the writing of the stories is to express ourselves, and to help other individuals get inside of our minds,” said Forsythe. He added, “With AI, what I’m concerned about is, yes, it can write a story, but it doesn’t actually have a vision for what that story consists of. It is not trying to communicate to you what’s in its mind, because it does not have a mind.” For this reason, Forsythe has been hesitant to incorporate AI into his creative writing classes. 

Jasser discussed how AI has reshaped his teaching style. In years prior, when introducing students to programming in R, he primarily assessed them through “fill in the blank” questions. After AI became proficient at completing those gaps in code, Jasser shortened his lectures to 15-20 minutes and devoted more class time to applying concepts through practice.

“My job mostly is to lecture at the very beginning and then for the rest of the class, I’m just walking around, tracking the progress of each student, interfering and giving advice on the spot,” said Jasser. “I’m more of a curator of knowledge instead of just a person who lectures.”   

Forsythe acknowledged that there are challenges with allowing students to use AI in writing classes, particularly when they are writing essays. “The working of that essay through multiple drafts was a way to work through your thoughts on the subject and to focus on a topic in a sustained way for a long period of time,” Forsythe said. “In a writing class that’s using an LLM, or a generative AI, that’s no longer the case.” 

According to Forsythe, the increased use of AI also makes it harder for students to stand out through their writing. “If you really want to be excellent, now you have to figure out, ‘How is what I’m going to produce be more cogent, more interesting, more well-defined or supported, or simply more original than what anyone else can produce with it?’” Forsythe said. 

For his intro-level R course, DTA 250, Jasser said, “I don’t recommend using [AI] until they are capable of understanding the language.” In his upper-level data visualization courses, AI has reshaped the grading process.

Comparing plots made by previous classes that did not use AI to plots produced by students now, Jasser said, “One of them is good for students who have not used the AI, as in I can grade them, I can find mistakes in there, but for those who are creating their plots with AI, these are perfect plots.” To address this, Jasser asks students to justify their process by answering a series of questions before and after creating their data visualizations. Students explain their approach, stating why they chose to create a specific plot and critiquing their own work. This is factored into the student’s grade, which Jasser said makes his evaluation feel more abstract. He considers each student’s ideas, not just their execution, which creates more work on his end. “How do you grade perfect work?” Jasser asked.

“At one point we have to stop grading on the ABC scale. We have to move on to either you complete the work or you don’t,” said Jasser. In an article published by The Orlando Sentinel, he discusses shifting the grading focus to “in-class work” instead of homework.  

While Forsythe allows his creative writing students to use AI in the revision process, he said that AI eliminates the struggle of revising one’s work, which is how writers improve, and he warns students “that the AI will attempt always to sand off the interesting edges of the piece because it wants to make things similar to other things.” He added, “The thing that makes good writing good is that it’s weird, that it doesn’t necessarily look like every form of writing that is out there, that it does something that you’ve never seen before, and that character acts in a way that both feels so authentic and so original at the same time.”

For students trying to stand out in their writing while competing against AI, Forsythe said, “I think that there is, and there will continue to be, an appetite for original stories, especially original stories that come from a specific author. In other words, when I look at my bookshelves, and I see some of my favorite books, I’m immersed in those stories, but I’m also invested, in many ways, in the person who wrote those stories.” Forsythe acknowledged the possibility that new generations may not care if the writing they read is generated by AI, though he doesn’t believe this will happen.  

Forsythe said, “I think that there will always be room for individuals who are writing things that are interesting, and I think we’re going to start to crave that.” 

Jasser expressed concerns about students’ reading ability. “They don’t have the attention span, as in, if I’m going to give 30 students two pages of articles to read, I’m pretty sure only two of them will read the two pages fully.”  

Along with reading articles, Jasser explained that for students to work effectively alongside AI, they need to carefully read AI-generated text, question its accuracy, and check its sourcing. “My biggest issue is that students have blind trust in AI,” he said. “If you just blindly trust it and just take whatever it throws at your screen for granted, then there is something wrong, as in, this is not you doing your due diligence.”  

Jasser referred to OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude as the “big three” AI chatbots. He recommended Claude, though he noted that another company could produce a better model in the near future. “If I want to say an AI that is safe to use, that will kind of measure itself before giving you the answer, I’ll probably say Claude would still be the safest. Gemini will search the internet and give you an answer, and sometimes their references are not good. ChatGPT, they are always trying to stay ahead of the curve, and they’re always introducing all these new experiences that might affect academia.”

Forsythe’s hope for the future of AI is that it doesn’t eliminate creativity, but rather provides more time for engaging in it, along with other fulfilling activities. “I would love to use the AI to help with things that I really want to get done quickly, so that I can immerse myself in the parts of life that I love, including writing, or reading good books, or things of that nature,” Forsythe said. “I want to use the AI to enable those other parts to become more rich, not to let it take over the parts of my life that I find deep satisfaction in, leaving me only with the parts that were tedious to begin with.” 

As for how AI may be used to improve our lives: in recent years, a significant number of young adults have consulted AI for mental health advice. When asked about using AI chatbots in place of human therapists, Jasser said, “I don’t believe in that. No, don’t do that. For that, you really need a human.”

“You have to understand that the AI doesn’t take whatever you say and think about it as humans do from an emotional perspective and reply to you. The AI takes your text and creates a sentence or a paragraph by predicting the next word in that paragraph that would be the most applicable to the question you asked,” he said.  

While we are still learning about the long-term effects of AI usage, and educators are developing different strategies for implementing AI in their classes, ultimately, students are responsible for how they choose to use or not use AI tools. Despite representing different departments, Jasser and Forsythe both encourage students to use AI to support their thinking rather than allowing AI to think for them. 
