The independent news website of The Kent Stater & TV2



How AI is perceived in classrooms leading up to finals

Brittany Lucia
The CCI Student Lounge's computer lab, where all CCI students can work on assignments and projects, on Oct. 19, 2023.

With finals quickly approaching, the stress can tempt students to turn to artificial intelligence for help with assignments.

This semester, instructors addressed the use of AI in their classrooms through their syllabi, highlighting both its potential benefits and its ethical risks.

Matthew J. Craig, a Ph.D. candidate in the College of Communication and Information, is a human-machine communication researcher. In other words, Craig studies the way humans interact with and through machines.

Craig said his research and schooling throughout his academic career allowed him to study the way AI can impact the environments it is placed in.

“My research and education really coincide with how I’ve looked to understand how AI can impact society and impact what it means to communicate with people,” Craig said. 

With the rise of AI, more research is still needed to grasp the influence it can have, especially in education, Craig said.

“I think that there’s still research to be done to understand its impact in the classroom,” he said. “It’s important that users in general understand a number of the limitations for these technologies, the biases they can have, and the lack of explainability for how it gets its answers.” 

Craig said with ChatGPT, a user inputs data and the software gives the user a response in return. This raises questions about where the information in that response came from and how the user can know it is actually the information they were looking for.

Tom Schindler, associate director for the Information Technology Security team, said he urges students to use AI with caution, especially with sensitive information.

“I wouldn’t recommend putting any kind of personal data into the AI chatbot as it is right now,” he said. 

Schindler said users should think about the information being put into an AI platform and ask whether it is information they would want displayed.

Some students see AI platforms as useful tools but have concerns over their proper use, with ChatGPT as one example.

Christopher Shaffer, a senior aerospace engineering major, said AI platforms such as ChatGPT can be beneficial to students but should not be abused. 

“I think it’s definitely a useful tool,” Shaffer said. “I don’t think it should be what you turn in.” 

Shaffer said users of ChatGPT should fact-check the information the platform gives them, but he still believes it benefits students.

“You still have to be cautious with what it’s telling you,” Shaffer said. “So you should definitely back it up with other information. But, I do think it’s very useful to students.” 

Madeline Irwin, a sophomore psychology major, said students can use AI platforms to complete assignments for them and expressed concern that students are not using the platforms ethically.

“I think that students are misusing it,” Irwin said. “I know a lot of students are like using it to fully write papers for them. I don’t think that should be happening, obviously that’s literally plagiarism.” 

Irwin said there is nothing wrong with students using AI to boost creativity and generate ideas. An issue occurs when students rely on AI to complete assignments for them.

“If you’re using [AI] to help get, like ideas or like help for certain topics, I think that’s OK,” Irwin said. “It can be very helpful for ideas for you, but it should only be used as ideas.”

While most instructors address the use of AI in their syllabi, some professors bring it into their classrooms.

Ruoming Jin, a computer science professor, said he uses ChatGPT in his lectures because of the heavy mathematical background his course requires.

“I actually allow them to use ChatGPT to help them with their math homework,” Jin said. “ChatGPT actually makes mistakes. The answer for the math problem is not always accurate, so the student really has to understand what’s going on.” 

While students' use of AI for assignments is up for debate, the software and the intelligence behind it will continue to change, Schindler said.

Craig said it is important that individuals begin to educate themselves about the emerging technology and to start thinking about how AI can be incorporated into lecture halls.

“I think it’s very important that we begin to educate ourselves about this technology,” Craig said. “How can we incorporate [it] in the classroom and like how are we going to teach students how to properly use these tools?”

MinJee Yoo is a reporter. Contact her at [email protected].
