lukev 5 hours ago

This is bad in tech. But at least we are (relatively) well equipped to deal with it.

My partner teaches at a small college. These people are absolutely lost, with administration totally sold on the idea that "AI is the future" while lacking any kind of coherent theory about how to apply it to pedagogy.

Administrators are typically uncritically buying into the hype, professors are a mix of compliant and (understandably) completely belligerent to the idea.

Students are being told conflicting information -- in one class that "ChatGPT is cheating" and in the very next class that using AI is mandatory for a good grade.

It's an absolute disaster.

Terr_ 2 hours ago | parent | next [-]

I've been telling my curious/adrift relatives that it's a machine that takes a document and guesses what "usually" comes next based on other documents. You're not "chatting with it" so much as helping it construct a chat document.

The closer they can map their real problems to make-document-bigger, the better their results will be.

Alas, that alignment is nearly 100% when it comes to academic cheating.
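The "chat is really document completion" point can be shown with a toy sketch. This is not a real model; the predictor here is a hypothetical lookup table standing in for "what usually comes next," just to show the shape of the loop:

```python
# Toy sketch: a "chat" is one growing document that a predictor keeps
# extending. A real LLM predicts the next token from learned statistics;
# this fake predictor uses a hard-coded table for illustration only.

def next_word(document: str) -> str:
    # Stand-in for the model: guess what "usually" comes next.
    table = {"Assistant:": "Hello!", "Hello!": "<END>"}
    return table.get(document.split()[-1], "<END>")

def chat(user_message: str) -> str:
    # The chat UI formats your message into a transcript document,
    # then the model appends to it until it decides to stop.
    doc = f"User: {user_message}\nAssistant:"
    while True:
        word = next_word(doc)
        if word == "<END>":
            break
        doc += " " + word
    return doc

print(chat("Hi"))  # the whole transcript, not just a "reply"
```

The point of the sketch is that the "assistant" is just whatever text lands after `Assistant:` in the document, which is why mapping your problem onto make-document-bigger works so well.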

chatmasta 4 hours ago | parent | prev | next [-]

The wild part is they’re having this reaction while using the most rigid and limited interfaces to the LLMs. Imagine when the capabilities of coding agents surface up to these professions. It’s already starting to happen with Claude Cowork. I swear if I see another presentation with that default theme…

iugtmkbdfil834 4 hours ago | parent [-]

This. As annoying as all sorts of 'safety features' are, the sheer amount of effort that goes into further restricting them on the corporate wrapper side makes the LLM nigh unusable. How can those kids even begin to get an idea of what it can do when it's so severely locked down?

pjc50 4 hours ago | parent [-]

Could you provide an example of such a thing that is prevented?

iugtmkbdfil834 an hour ago | parent [-]

Sure. In the instance I am aware of, SQL (and XML and a few other) files are explicitly verboten, but you can upload them as text and reference them that way. References to personal information like DOB immediately stop the inference with no clear error as to why, but referencing the same info any other way allows it to go on.

It is all small things, but none of those small things are documented anywhere, so whoever is on the other end has to 'discover' them through trial and error.

metalliqaz 3 hours ago | parent | prev | next [-]

By my understanding, the administrators at small colleges are among the least capable professionals one might find anywhere in the economy.

throwawaysleep 21 minutes ago | parent [-]

A friend and I have a contract with a local university here in Canada.

They paid for custom on-prem software, and in over a year they have not fully provided the access and infrastructure to install it.

We have been paid already, but they paid for a tool they can’t get their shit together enough to let us install.

whattheheckheck 4 hours ago | parent | prev | next [-]

When industrialization was taking root, yes, indeed the factory jobs sucked AND they were the future. Two things can be true.

ares623 20 minutes ago | parent [-]

You left out the part that the non-factory jobs sucked more (or were just non-existent).

This is the opposite.

webdood90 4 hours ago | parent | prev | next [-]

> These people are absolutely lost, with administration totally sold on the idea that "AI is the future" ...

Doesn't sound that different from my tech job

jakelsaunders94 4 hours ago | parent | prev | next [-]

This is really interesting. I've been out of education for a long time, but I was wondering how they were dealing with the advent of AI. Are exams still a thing? Do people do coursework now that you can spew out competent sounding stuff in seconds?

Al-Khwarizmi 2 hours ago | parent [-]

I teach CS at a university in Spain. Most people here are in denial. It is obvious to me that we need to go back to grading based on in-person exams, but in our last university reform (which tried to copy the US/UK in many respects) there was so much political posturing and indoctrination about exams being evil and coursework having to take the fore that now most people just can't admit the truth before their own eyes. And those of us who do admit it have limited room to maneuver, because grading coursework is often a requirement imposed from above that we can't fundamentally change.

So in most courses nothing has changed in the way we grade. Suddenly coursework grades have gone up sharply. Anyone with working neurons knows why, but in the best case, nothing of consequence is done. In the worst case (fortunately uncommon), there are people trusting snake-oil detectors and probably unfairly failing some students. Oh, and I forgot: there are also some people who are increasing the difficulty of the coursework in line with LLMs. Which I guess more or less makes sense... except that if a student wants to learn without using them, they will suddenly find the assignments out of their league.

So yeah, it's a mess.

technothrasher an hour ago | parent [-]

> Except that if a student wants to learn without using them

My son, a freshman at a major university in NYC, told his freshman English professor that he wanted to write his papers without using AI. He was told that this was "too advanced for a freshman English class" and that using AI was a requirement.

senordevnyc an hour ago | parent [-]

Now colleges will have to try and detect if you didn't use AI!

ares623 22 minutes ago | parent | prev | next [-]

Meh, today I opened twenty PRs and felt great. That's worth it to me. (/s)

https://twentyprsaday.github.io/

5 hours ago | parent | prev [-]
[deleted]