ethmarks 5 hours ago

> most students will learn a lot less than say 5 years ago while the top 5% or so will learn a lot more

If we assume that AI will automate many/most programming jobs (which is highly debatable and I don't believe is true, but just for the sake of argument), isn't this a good outcome? If most parts of programming are automatable and only the really tricky parts need human programmers, wouldn't it be convenient if there are fewer human programmers but the ones that do exist are really skilled?

mgraczyk 5 hours ago | parent [-]

It's not good if you're a freshman currently starting a CS program, or a teacher trying to figure out what to do.

ethmarks 5 hours ago | parent | next [-]

Well, as a college student planning to start a CS program, I can tell you that it actually sounds fine to me.

And I think that teachers can adapt. A few weeks ago, my English professor assigned us an essay where we had to ask ChatGPT a question and analyze its response and check its sources. I could imagine something similar in a programming course. "Ask ChatGPT to write code to this spec, then iterate on its output and fix its errors" would teach students some of the skills to use LLMs for coding.

mgraczyk 5 hours ago | parent [-]

This is probably useful and better than nothing, but the problem is that by the time you graduate, it's unlikely that reading the output of the LLM will still be a useful skill.

ethmarks 4 hours ago | parent [-]

Fair point. Perhaps I'm just too pessimistic or narrow-minded, but I don't believe that LLMs will progress to that level of capability any time soon. If you think that they will, your view makes a great deal of sense. Agree to disagree.

JumpCrisscross 5 hours ago | parent | prev [-]

> It's not good if you're a freshman currently starting a CS program

CS is the new MBA. A thoughtless path to a safe, secure job.

Cruelly, but necessarily, a society has to destroy those pathways. Otherwise, it becomes sclerotic.