From Books to Bots: Albion Professors Weigh in on AI

A robot works on an assignment while a student looks over its shoulder. Amid the surge in AI usage, educational institutions grapple with rising concerns about academic integrity (Photo illustration by Phoebe Holm).

Artificial intelligence (AI) is defined by the Encyclopedia Britannica as “the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”

A chatbot is an AI program that mimics human conversation using a combination of machine-learning algorithms and pre-written scripts. ChatGPT, one of the more popular chatbots, has become a focal point of discussion in universities, challenging conventional teaching methods and raising questions about academic integrity.

When Dr. Mark Bollman of the mathematics department first heard about ChatGPT in December 2022, he took questions from his 300-level geometry class and entered them into the platform.

“What I found was that it wasn’t very good at math,” Bollman said. “It’s gotten a little bit better, but it’s not very good at some math topics that I teach.”

Dr. Mauricio Marengoni of the computer science department, who teaches a machine learning course, explained in his own words what ChatGPT is.

“ChatGPT is basically a big database where, if you have a problem, you can access it quicker than with other platforms,” Marengoni said. “Unfortunately, sometimes your answer won’t be correct.”

Bollman said there are topics “routinely talked about in some of my classes that ChatGPT isn’t handling well yet.”

“Fortunately, we don’t deal with papers in the same sense that other departments do,” Bollman said.

Dr. Christopher Riedel of the history department said that to keep students from relying solely on AI to write such papers, he put certain measures in place at the beginning of the fall semester.

“I have a very long explanation of what you need to do if you use AI. I need to see the prompt, I need to see what you’re getting and you need to explain to me how you modified the material you got,” Riedel said. “It’s basically the same as what I would expect if someone used an outside platform, which is to cite their sources.”

Dr. Joe Lee-Cullin of the Earth and Environment department said a common concern among colleagues is the passability of AI-generated papers and what that means for the future of take-home exams.

“When (students) don’t see value in the work you’re asking them to do, then they’re going to find ways to put as little effort as possible into it,” Lee-Cullin said. “However, it’s also on us to check in and make sure people are getting the material we want them to grasp.”

To Marengoni, AI is efficient but doesn’t assist in developing critical thinking skills.

“What are you in college for? If you don’t want to think, you don’t have to come to college,” Marengoni said.

Dr. Brad Chase of the anthropology department said in an email on Oct. 3 that “The costs (of AI) are to students who rely too heavily on AI – if you graduate and all you know how to do is ask the robots for answers, you won’t be very useful to anyone after college.”

This is a concern shared by other institutions as well, such as Saint Anselm College.

According to an article from “The Saint Anselm Crier” titled “The Negative Impacts of AI on Students: Strategies for College Professors,” students who use AI “may not have the opportunity to develop problem-solving skills, leading to a dependency on AI-powered solutions.”

Bollman and Marengoni both said that AI has been shaping student learning since long before ChatGPT existed.

“(We got) calculators in the classroom close to 50 years ago now, when the calculator came on the market. Suddenly you could do addition, subtraction, multiplication and division with a machine,” Bollman said. “The rest of the world is just catching up – just having the discussions we’ve been having for years.”

The world of computer science has also witnessed the rise of different AI platforms over the years and their integration into daily life.

“We’ve had some very specific types of AI tools like the GPS that we use to find a route to go somewhere. When we talk about AI, it’s any kind of system that can behave like a human brain,” Marengoni said. 

As AI’s influence continues to grow, it sparks discussion among professors about Albion College’s future regarding machine-generated responses and media.

In the history department, Dr. Riedel weighed in on AI art. 

“We need to talk about the problem in the arts like AI illustrators. They’re borrowing from real human beings with no consent and without even acknowledging the work that they’re using,” Riedel said. 

In the computer science department, Dr. Marengoni evaluated the skill of critical thinking. 

“It’s not really important how to find information, but how to use it right. We need to determine how students can analyze results or reason about solutions,” Marengoni said. 

Dr. Bollman, in turn, said that “the mathematical community deals with this stuff all the time. We in turn change what we teach and how we teach it.”

“I think people will eventually embrace it as a useful tool, but I think that’s going to have to change the way that we teach people how to do research,” Lee-Cullin said.

Overall, Albion professors believe AI needs more improvements before taking a primary role in the future of academia. 

“I would want to make it more publicly accessible and expand server capacity. It would be helpful in terms of if I want to embrace it as a tool,” Lee-Cullin said. 

That wish for broader access extends to Bollman, who said he would like “to see more students studying it in my department. We’ve got future computer scientists who are involved with the frontiers of that subject.”

According to reporter Will Henshall of The New York Times, if those frontiers of AI continue to expand, a wider range of people will be able to access answers unknown to experts in those fields. 

“AI has potential. However, anytime we expect what is fundamentally a tool to be a solution in and of itself, and not simply a tool available to us to come up with our own solutions using that tool, there is danger,” Riedel said.

Editor’s Note: 11:43 a.m. Wednesday, Oct. 11. Since the original publication of this article at 9 a.m. Monday, Oct. 9, a sentence misattributed to Phil Kormarny of Maryville University and information about the Albion College admissions process have been removed from the story for accuracy.

About Killian Altayeb
Killian Altayeb is from Novi, Michigan and is a second-year student at Albion College. They are a Biochemistry Major with a journalistic interest in all things public health. Contact Killian via email at NA12@albion.edu.
