Posted by gnabgib 10 hours ago
Former (second-generation) college professor, here. I find it almost impossible to be cynical enough about the US education industry.
This statement is more defensible after removing “only”. If it “only” hurt the cheaters, there would be no need to police cheating at all.
And they'll do it with all the 'unnecessarily high stakes' and 'risk of unconscious bias' and 'not truly representative' problems that written exams have; and a bunch of extra problems too.
Imagine being able to do some writing without notifications going off every few seconds, and without always being one click away from a search engine and some website scientifically designed to drag your attention down a rabbit hole and keep it there.
Gyms aren't redundant because tractors exist.
We're doing these students a major disservice making them live in the old world. It's our fault for being inflexible, but their world is going to be wholly different and we should just embrace that.
LLMs are also making a public code portfolio much less valuable as a sign of legitimacy.
My colleagues that teach hard skills courses (like data structures and algorithms) either love AI and incorporate it into their teaching at every moment possible, or despise it in the same way graphing calculators were by high school math teachers when they were introduced nearly 30 years ago.
I teach soft skills classes to engineering students, and I'm unconcerned with students using AI. I write my problems in a way such that, if the student truly understands the assignment, prompting the AI to solve the problem and iterating on it takes a similar amount of time to doing the work themselves. AI is not very good at writing introspectively about the student. In other words, AI isn't going to be helpful when the homework question is "A fellow student comes to you asking for suggestions on how to maximize their chances at landing an internship. What advice do you give them that's immediately actionable?"
Try it, plug that into ChatGPT or your favorite LLM. It parrots the same generic tips everyone tells you, with very little on "how" to perform the action in an effective way. Read it, copy it into your advice document, get a poor grade. Try telling other students to take this advice. Note how they don't, because the advice isn't actually actionable enough for them to act on.
LLMs are also not very good at the follow-up question "In a previous assignment you gave specific and actionable advice to a peer on the job search. Which of these suggestions were so good you are now doing them?" A number of students write a "Mental Gymnastics" essay, claiming they are following all their suggestions (because they think that's what the professor wants to hear) while the evidence they provide demonstrates they are not. A student asking an LLM to write the essay for them consistently produces a digital 'pat on the back': a mental gymnastics essay that ultimately makes the student realize how unwilling they are to solve the #1 problem in their college career.
I've done away with exams wherever possible. I stick to project-heavy courses. What I've found to be far more concerning than AI use is the increasing loss of social skills and ability to cooperate within the younger generations. The number of students who would prefer to fail a class instead of talk to literally any human being is astounding.
The number of students who refuse to build soft skills, and believe that tech is truly a meritocracy where the only thing that matters is 'lines of code', there's no politics, and they won't have to work on-call or crunch or give code reviews, is also astounding.
My mentor, a PhD in classics, told me it was never about outcomes and only about improvement. I suppose that answers my question. If your AI gets you an A at the start of the course and an A at the end, then, in the sense that you have not improved at all, you have failed.
When I see 'cheat sheets' - designed to be hidden on the back of calculators or whatever - then I see true application of human ingenuity and intellect.
Testing and instruction should be modified to account for AI. If a student uses an agentic AI for work, learning, or research, then when test time comes, the student should be required to stand at the front of the class and teach the class what they have learned, i.e. "teach back" everything to their classmates and the teacher. The entire class, instructor included, should also be required to participate in a Q&A session to make sure that student's learning is not just memorization, e.g. the student must restate the information learned using different words, different scenarios, etc.