Purdue University approves new AI requirement for all undergrads (forbes.com)
54 points by rmason 6 hours ago | 42 comments
dehrmann 4 hours ago | parent | next [-]

Full disclosure: I'm a Purdue graduate, though I disagree with certain things the school has done (Purdue Global).

Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement that teaches how to use it in a way that improves learning rather than just enhances cheating makes sense. The problem with the broad, top-down approach is that it looks like what happens in Corporate America when there's a CEO edict that "we need a ____ strategy" and every department pivots its projects to include that, whether or not it makes sense.

daxfohl 4 hours ago | parent [-]

I like this take. It seems like it would be useful to require professors to sit in on the class too. It'd be interesting to hear lots of different perspectives, ideas, concerns, etc., rather than a lecture format to half-awake students about something they arguably know more about than the instructor.

mwkaufma 4 hours ago | parent | prev | next [-]

Heads up: forbes.com/sites/xyz contributors are people and groups who pay for the domain, but aren't edited or promoted by Forbes itself. Almost always conservative interest groups posing as journalists.

mossTechnician 2 hours ago | parent | next [-]

Additional information about Forbes' downward trajectory: https://larslofgren.com/forbes-marketplace/

andy99 4 hours ago | parent | prev [-]

Yes this has conservative psy-op written all over it /s

mwkaufma 4 hours ago | parent [-]

Nietzel's whole shtick is "college reform" i.e. dismantling and financialization. See his book "Coming to Grips with Higher Education." Mixing non-agitprop into the feed is part of agitprop.

capyba an hour ago | parent | prev | next [-]

What exactly is an “AI working competency”? How to have a conversation with a chatbot? How to ask a chatbot a question that you confirm with a Google search and then confirm that with a trusted online reference and confirm that with a book you check out of the library three weeks later?

Perhaps the world is going the direction of relying on an AI to do half the things we use our own brains for today. But to me that sounds like a sad and worse future.

I’m just rambling here. But at the moment I fail to see how current LLMs help people truly learn things.

noitpmeder 5 hours ago | parent | prev | next [-]

How to Speedrun devaluing the credentials your institution exists to award.

conartist6 5 hours ago | parent | prev | next [-]

Well that's a public embarrassment...

andy99 5 hours ago | parent [-]

That was my thought; it feels like something a career college or high school would do. Are CS students going to have to take a “how to talk to ChatGPT” course? That’s probably less condescending than making an arts student, or someone else who doesn’t need to have anything to do with LLMs, sit through it.

I thought Purdue was a good school; these kinds of gimmicks are usually the province of low-tier universities trying to get attention.

turtleyacht 4 hours ago | parent | next [-]

Optimistically, the idea could be to push prerequisites to an always-on, ever-available resource. Depending on the major, skills could include organizing papers into outlines, using Excel, or building a computer.

Professors can tailor lectures to narrower topics or advanced, current, or more specialized subjects. There may be less need to have a series of beginning or introductory courses--it's assumed learners will avail themselves.

Pessimistically, AI literacy contributes to further erosion of critical thinking, lazy auto-grading, and inability to construct book-length arguments.

basch 2 hours ago | parent | prev [-]

> “how to talk to ChatGPT” course?

It's not unrealistic to be selecting for people with strong language skills and the ability to break tasks into discrete components and assemble them into a process, or the skill of being able to define what they do not know.

A lot of what makes a person good with an LLM also makes them good at general problem solving.
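
(To make the "break tasks into discrete components" point concrete, here is a minimal, hypothetical Python sketch. Nothing below is from the article or Purdue's requirement; call_llm and summarize_paper are made-up names standing in for whatever chat API and workflow you actually use.)

    def call_llm(prompt: str) -> str:
        # Stand-in for a real chat API call; plug in your provider of choice.
        raise NotImplementedError

    def summarize_paper(paper_text: str) -> str:
        # 1. Name what you need extracted (define what you don't know).
        claims = call_llm("List the main claims in this paper:\n" + paper_text)
        # 2. Work each component as its own small task.
        evidence = call_llm("For each claim below, quote the supporting passage.\n"
                            "Claims:\n" + claims + "\nPaper:\n" + paper_text)
        # 3. Assemble the pieces into the final product.
        return call_llm("Write a one-paragraph summary from these claims and "
                        "their supporting evidence:\n" + claims + "\n" + evidence)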

djoldman 5 hours ago | parent | prev | next [-]

The announcement is here:

https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...

Where the actual news is:

> To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide “artificial intelligence working competency” graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.

So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.

Who knows what will be in the final requirement.

gmfawcett 4 hours ago | parent [-]

Delegated to the provost and deans. Who else would you expect to hold accountable for developing a graduate attribute?

djoldman 2 hours ago | parent [-]

I guess they would already have had that authority?

I think it would be the ongoing job of the deans, or at least someone, to set graduation requirements? Why would the trustees have to explicitly delegate it?

whatever1 2 hours ago | parent | prev | next [-]

Realistically universities will have to ban the usage of computers for exams and homeworks.

For the same reason that elementary schools don't allow calculators in math exams.

You first need to understand how to do the thing yourself.

smegger001 2 hours ago | parent [-]

Realistically you can't ban computers for homework, since you don't control the environment. As for banning them for exams, that may work for most of the humanities, math, chemistry, and physics, but good luck trying to teach a computer science degree without computers, or graphic design, or any number of programs that rely on them, if you want students to be competent with the standard tools of their trade, which are on computers: audio engineering, video editing, and so on.

vostok 2 hours ago | parent | next [-]

Outside of distance learning, I think computers are very rarely used for CS exams, in my (albeit dated) experience.

whatever1 an hour ago | parent | prev [-]

People not long ago were literally programming on paper.

So no, computers are not required to teach computer science.

jleyank 4 hours ago | parent | prev | next [-]

From my long-ago uni courses, current-day AI could have helped with the non-major courses: English and History, doing the first draft or even the final drafts of papers, etc. As a science major, I'm not sure what the point of relying on an AI is as it would leave you empty when considering further education or the tests they require. And as far as a foreign language goes, one needs to at least read the stuff without relying on Google Translate (assuming they have such a requirement anymore).

But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.

thfuran 3 hours ago | parent | next [-]

>As a science major, I'm not sure what the point of relying on an AI is as it would leave you empty

Why do you think it wouldn't do the same for other fields? The purpose of writing essays in school is never to have the finished product; it's to learn and analyze the topic of the essay and/or to go through the process of writing and editing it.

conartist6 4 hours ago | parent | prev [-]

It'll get you an academic integrity investigation if you get caught using it to write either a first draft or a final draft of a paper, and especially for an English class where the whole point is for you to learn how to write.

If you're going to try to fake being able to write, better to try to dupe any other professor than a professor of English. (source: raised by English majors)

jleyank 3 hours ago | parent [-]

Hope so. But if you can’t use it here, where CAN you use the thing??

brian-armstrong 5 hours ago | parent | prev | next [-]

https://archive.ph/g1a1X

gamblor956 5 hours ago | parent | prev | next [-]

This is going to be like when all the schools were pushing big data because that was going to be the next big thing.

After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience, (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.

That's the bright future that Purdue is preparing its students for.

Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.

jart 3 hours ago | parent | next [-]

I hope Anthropic is saving all my interactions with Claude so they can replace me when I'm gone.

Then future generations who like old school systems hacking will be able to pair program with Justine AI.

SOLAR_FIELDS 3 hours ago | parent [-]

This is a much lighter take than mine which is that our behaviors being input into this system will eventually be used to subjugate and control future generations. I like it

andy99 5 hours ago | parent | prev | next [-]

This has nothing to do with whether the technology is valuable or not; it’s about cramming superficial treatment of trendy topics into academic degree requirements, which, whatever one thinks of AI, should be frowned upon.

ivape 4 hours ago | parent | prev [-]

It's definitely something that won't age well. Kids are going to grow up with many AI friends by the time they get to college.

bigstrat2003 an hour ago | parent [-]

If that's the case it sounds like universities will need to hire an army of psychologists to undo the damage that will have been done to those kids. Treating LLMs as a friend is profoundly unhealthy and will not end well.

AndrewKemendo an hour ago | parent | prev | next [-]

I was on the academic board of engineering mechanics for Purdue almost a decade ago.

Purdue, not necessarily uniquely but specific to its charter, does a really good job of focusing on workforce development in its engineering programs. It is very highly focused on staffing and training and less so on the science and research side, though that exists as well.

This tracks with what I would expect and is in line with what I think best practice should be.

danaris 2 hours ago | parent | prev | next [-]

If they were to set down today the curriculum needed to meet such a requirement, by the time the students who matriculate in August graduate, it would be so out of date as to be effectively worthless.

This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing way too fast to even be sure that basic fundamental skills related to it will remain relevant for as many as 4-5 years.

65 3 hours ago | parent | prev | next [-]

Seems like a knee-jerk reaction more than anything. I'm sure this is to justify hiring even more administrators.

bgwalter 3 hours ago | parent | prev | next [-]

https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...

"all as informed by evolving workforce and employer needs"

“At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."

Purdue is engaging in the oldest profession in the world. And the students pay for this BS.

turtleyacht 5 hours ago | parent | prev | next [-]

Upfront computer literacy may never have been convincing enough; AI could be the ubiquitous and timely leverage that opens the way for general machine thinking.

keiferski 5 hours ago | parent | prev [-]

I don’t really get the dismissive comments here. Universities have had gen ed requirements for years, one of which is usually something to do with computers. AI seems to be a technology that will be increasingly relevant…so a basic gen ed requirement seems logical.

BeetleB 4 hours ago | parent | next [-]

The problem is the field is changing way too fast. It's almost certain that whatever they learn will be outdated, wrong, or poor practice by the time they graduate. Just compare with the state of things two years ago.

bigstrat2003 an hour ago | parent | prev | next [-]

> AI seems to be a technology that will be increasingly relevant

That's why you don't understand the dismissive comments. The reality is that the technology sucks for actually doing anything useful. Mandating that kids work with a poor tool just because it's trendy right now is the height of foolishness.

UncleEntity 4 hours ago | parent | prev | next [-]

Yeah, I'm still bitter I had to pass a literacy exam to get my BA and that was 28 years ago.

And I just know this is going to turn into a (pearl-clutching) AI Ethics course...

alephnerd 4 hours ago | parent | prev [-]

These are the same people who would pooh-pooh teaching Excel and basic coding skills to non-STEM majors or have CS students take ethics or GenEd classes.

AI/ML isn't going to completely shift the world, but knowing how to do basic prompt engineering, validate against hallucinations, and tell the difference between ChatGPT and GPT-4o is valuable for people who do not have a software background.
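
(As one possible reading of "validate against hallucinations": never accept a factual claim from the model without checking it against a source you trust. A hypothetical sketch, with made-up helper names that aren't anything Purdue or the article specifies:)

    def call_llm(prompt: str) -> str:
        # Stand-in for a real chat API call; plug in your provider of choice.
        raise NotImplementedError

    def answer_with_check(question: str, reference_text: str) -> str:
        draft = call_llm("Answer briefly: " + question)
        # Second pass: justify the draft strictly from the reference,
        # flagging any sentence that can't be supported.
        verdict = call_llm("Using ONLY the reference below, label each sentence "
                           "of the answer SUPPORTED or UNSUPPORTED.\n"
                           "Reference:\n" + reference_text + "\nAnswer:\n" + draft)
        if "UNSUPPORTED" in verdict:
            return draft + "\n[Warning: not fully supported by the reference]"
        return draft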

Gaining any kind of knowledge is a net win.

hansmayer 4 hours ago | parent [-]

"basic prompt engineering" - Since when has writing English language sentences become nothing less than "engineering" ?

IncreasePosts 2 hours ago | parent [-]

It's more about knowing the tricks to get LLMs to give you the output you want.

However, there's no reason to think any given trick will still be relevant even a year from now. As LLMs get better, why wouldn't we just have them auto-rewrite prompts using the appropriate prompt-engineering tricks?
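
(That auto-rewrite step is already easy to sketch; a hypothetical two-pass version, where call_llm is just a stand-in for any chat API:)

    def call_llm(prompt: str) -> str:
        # Stand-in for a real chat API call; plug in your provider of choice.
        raise NotImplementedError

    def answer(rough_request: str) -> str:
        # Pass 1: rewrite the rough request into a structured prompt
        # (explicit goal, output format, constraints).
        improved = call_llm("Rewrite this request as a clear prompt with an "
                            "explicit goal, output format, and constraints:\n"
                            + rough_request)
        # Pass 2: answer the improved prompt.
        return call_llm(improved)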