The conversation surrounding the academic use of AI typically centers on the potential harm to students’ learning. When English majors are using AI to write everything from a full essay to a 100-word discussion post, and computer science majors rely on DeepSeek for coding assignments, how could it not do educational damage? Certainly, the gratuitous use of AI in schooling will negatively affect students. But this detriment stems not just from students using AI as a substitute for learning but also from professors who use it as a substitute for teaching.
Within days of ChatGPT’s introduction, students incorporated AI into their academic routines: editing essays, solving statistics homework, and generating ideas. While AI may increase accessibility, quality of education, and personalization of resources, it also leads to cheating, plagiarism, and overdependence on technology. And though AI has become deeply ingrained in our academic lives, it is relatively new, and its impact is relatively unknown. We know that AI is a shortcut, but we don’t know if it will ultimately function to expedite our learning processes or substitute for them.
Now, in what feels like the second wave of AI usage, in which every company has added “AI-powered” to its slogan, professors have caught on to the shortcut as well. Many UChicago professors allow AI as an editing tool for essays. Some allow any AI usage with citations. And some have even swapped teaching assistants and p-sets for AI substitutes. Though these uses can help students learn to navigate AI in an increasingly technological society, problems arise when professors completely replace teaching with AI.
Last quarter, my professor for Public Policy Analysis implemented AI resources such as AllDayTA and Personify into our coursework. Both use course materials to answer questions and supply practice problems, marketing themselves as tailored teaching assistants. Though occasionally helpful for simple course-related questions, they are, on the whole, an inhibitor to learning. AllDayTA generally supplies practice problems identical to those from the textbook, homework, or class notes, and it answers little more than basic questions about the course material. For any student with the syllabus or textbook open, the resource is practically redundant, saving only a few seconds of navigation. Worse, students are required to complete all of their homework on Personify, the more problematic substitute for teaching. The resource is frequently incorrect: on multiple occasions, I and other students had to submit the same answer at least three times before Personify solved the equation for itself and discovered that, yes, seven does equal seven, wasting both time and resources in the process. And Personify gives students the answer with little to no pushback. If a student struggles (or pretends to struggle) with a problem for three queries or more, Personify will solve it for them, asking the student only to check the math and submit the spoon-fed answer. While some might use these resources for their intended purpose, as an explanatory tool to help stuck users navigate out of roadblocks, most students, managing a full course load and other stressors, fall to the temptation of an unearned answer.
So, students take to Personify because it hands them an easy A. In fact, the average on every Personify-based homework assignment was over 94 percent, in stark contrast to the D-minus average on the midterm and the C average on the final. In a class that touts a no-computer policy and cites the benefits of handwritten notes, AI is not accelerating the teaching and learning processes but substituting for them entirely.
Outside of this class, the examples are endless. A friend’s creative writing professor required students to edit and submit an AI-generated essay. In another friend’s graduate-level class, students were allowed to use AI if they could edit pieces enough to disguise their usage. And, most existentially, an email recently went out from the Human-Robot Interaction Lab to the Department of Computer Science detailing a potential research project meant to create “a robot that can teach pairs of children social emotional learning skills” and another “autonomous robot system that can be deployed in a child’s home.” This last example alone seems to entirely threaten the sanctity of our most formative human relationships, reshaping all forms of learning, from the home to the classroom.
Homework is one of the most fundamental components of students’ learning, particularly for STEM classes, which require students to complete practice problems to succeed at later stages of the course. And yet, as resources like Personify and AllDayTA mislead both professors and students into believing they have a better grasp on the material than they do, AI becomes a shortcut to an A but a substitute for teaching and learning. Consequently, when the time comes for AI-unassisted learning, students can’t replicate the processes they’ve merely skimmed for homework, banking on grading curves rather than genuine, long-term understanding.
Shortcuts help us get places faster. We take the hypotenuse when we can, use Excel spreadsheet hacks to expedite the mundane, and take the off-ramp when we see traffic. If there’s a faster way to accomplish something, we’ll take it. But learning and teaching are processes that depend on trial and error, collaboration, and time. AI shortcuts are meant to be efficient, but learning is, at its core, inefficient. With AI, teachers offer students a fast lane that gets them to an A faster but bypasses the slow yet necessary learning process. Sure, spending less time generating problems, understanding concepts, or finding resources can speed up learning and even out the academic-financial playing field, but, in these examples, the learning process is not expedited but sidestepped. When AI circumvents the learning pathway, it undermines students’ ability to understand the material, especially when not only students but teachers elect to use it.
AI is still in its early stages, so, naturally, we are also in the early stages of its applications. It will take time for professors to find the right tools that accelerate rather than replace teaching and learning. Until then, both students and professors will have to resist the temptation to take shortcuts that may get us where we’d like to go faster, but on a much less stable foundation.
Camille Cypher is a third-year in the College.