“If the AI takeover of higher ed is nearly complete, plenty of professors are oblivious.” Ian Bogost.
I’m preparing to greet the incoming college class of 2029. I teach an intro cultural anthropology class, and it’s early in the morning, so I’m usually the first professor they encounter in a classroom. I thought last year was going to be tough, facing the kids who had just entered high school when Covid hit. This semester, the challenge isn’t the repercussions of a past catastrophe; it’s the onrushing threat of AI that awaits these students when they arrive at the University of Texas, and then more broadly throughout their lives.
On Monday morning (18th), as I was preparing my opening remarks for the first class day, I received a university-wide email detailing all the new AI resources being made available to students and faculty through UT.AI. The most prominent is UT Spark, which promises to “work out ideas, summarize research, and brainstorm ideas”—all things I teach them to do on their own! Instead of turning to a TA for help, students are encouraged to use our “in-house AI tutor platform,” UT Sage. The email declares, “This is more than an announcement; it is an invitation to collaborate, innovate and lead together,” and it assures me that these tools were configured with “academic integrity.” This is an “ethical AI.”
Ian Bogost writes about dealing with the class of 2026, which has had AI available for most of its college career. He covers a lot of ground, but what jumps out at me most is his assessment that faculty really don’t comprehend the magnitude of the transformations occurring in our students’ lives and in how they think. Based on interviews with colleagues, he concludes that “faculty simply fail to grasp the immediacy of the problem. Many seem unaware of how utterly normal AI has become for students. For them, the coming year could provide a painful revelation.”
Over the past three years, I haven’t fretted over the rise of AI. I teach students in all my courses—upper division and intro—how to do ethnography. This involves detailed social observation over a period of time. AI can’t do that. I’ve stressed to them, too, that if an AI can do their work for them, then they’re probably obsolete. And AIs are indeed undercutting a lot of entry-level jobs that college grads typically land. So I haven’t worried about them “cheating” by using AIs on their assignments. Instead, I’ve demonstrated in class how I use AI, focusing on where and how it often goes wrong or hallucinates. I don’t try to frighten them with AI doom scenarios.
Now my stance—going into my first class on Monday (25th)—is to inform them of the “cognitive costs” of using AI and its deleterious effects on social skills. Since this is an intro cultural anthropology course, I’ll spend a lot of time on the latter. I’ll put it to them that college is their last chance to learn how to think, and they’d better get good at it, because very powerful forces are arrayed against them; the lure of succumbing to “brain rot” is intensifying. My aim is to teach them to think like an anthropologist, emphasizing the centrality of imagination in our species’ flourishing, the crucial power of memory, and the mental challenges of navigating social spaces. All of this is dramatized in my novel, The Last Cohort.
If you’re teaching a class this fall, please post a comment on how you’re approaching these challenges.
It took me 15 years to get my BA -- with 10 years off in the middle -- and I object to the idea that one needs college to learn to think. Our mutual friend the late Michael B never graduated from high school, never went to college -- and no one who met him could doubt his independent thinking.
So much of what you say about memory also applies to literacy and even more to computers; I know how much I rely on the internet to check my memory, to look up words (I used to go to a dictionary), to check sources (I know which author said it and don't remember where they said it -- but the internet often -- not always -- knows) -- just this morning I looked up an A. A. Milne poem, one line of which I could remember.
The step beyond these capacities that so-called AI takes is to synthesize information. I'm not at all persuaded that it's very good at it. And even synthesizing information isn't the end of thinking: perceiving connections that nobody has perceived before -- that's where originality comes in. And AI -- I'm not aware that it has that capacity at all.
You mention hallucinations -- the problem, as I see it (and maybe this is changing) is that AI doesn't provide its sources. I hope that's changing. Because without providing its sources, it's worthless for scholarship -- worse than Wikipedia, which theoretically at least requires sources. My understanding is that AI doesn't provide sources because the companies that developed it don't want to acknowledge their source materials. So -- unless that aspect has changed, it's unethical from the get-go, top to bottom.
Best wishes with your school year!
I’m horrified, John, that UT is so pliantly willing to hand over the keys of education, of the stewardship of human knowledge, to the AI Imperium. And that they are encouraging their brilliant professors to be handmaids in the service of The Great Plagiarism (which is what the “large language model” is).
So I love your speech to the incoming frosh. I was trying to think about what I would say to them, though I ain’t no anthropologist so I may be way off-base. And it might not resonate with students at all. It might be too high-horse and highfalutin. But this is what I came up with:
Cultural anthropology is human beings trying to figure out other human beings… which, when you think about it, is a very human thing to do. It’s important stuff, for the future of our society, but anthropology – the process of it – can be fun, as well. So when we ask machines to do our anthropology for us, we are giving up that opportunity for a deeper understanding of our fellow human beings; we are giving up the chance that each of us might be able to come up with some original insights, or even solutions for the things that plague us; we are giving up on the joy that comes from personal discovery; we are giving up a bit of our humanity.