6177c40f 3 days ago

I think we're entering a world where programmers as such won't really exist (except perhaps in certain niches). Being able to program (and read code, in particular) will probably remain useful, though diminished in value. What will matter more is your ability to actually create things, using whatever tools are necessary and available, and have them actually be useful. Which, in a way, is the same as it ever was. There's just less indirection involved now.

wiml 3 days ago | parent | next [-]

We've been living in that world since the invention of the compiler ("automatic programming"). Few people write machine code any more. If you think of LLMs as a new variety of compiler, a lot of their shortcomings are easier to describe.

qwm 3 days ago | parent | next [-]

My compiler runs on my computer and produces the same machine code given the same input. Neither of these is true of AI.
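
The contrast can be sketched with toy stand-ins (the functions here are illustrative placeholders, not real compiler or LLM APIs): a compiler is a pure function of its input, while sampled LLM decoding depends on random draws.

```python
import hashlib
import random

def compile_source(src: str) -> bytes:
    # Toy stand-in for a compiler: a pure function of its input,
    # so identical source always yields identical "machine code".
    return hashlib.sha256(src.encode()).digest()

def llm_generate(prompt: str) -> str:
    # Toy stand-in for sampled LLM decoding: the output depends on
    # random draws, so repeated calls with the same prompt can differ.
    candidates = ["print('hi')", "puts 'hi'", "echo hi"]
    return random.choice(candidates)

# Deterministic: same input, same output, every time.
assert compile_source("int main(){}") == compile_source("int main(){}")
```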

wiml 2 days ago | parent [-]

You can run an LLM locally (and distributed compile systems, where the compiler runs in the cloud, are a thing, too) so that doesn't really produce a distinction between the two.

Likewise, many optimization techniques involve some randomness, whether it's approximating an NP-hard subproblem or using PGO guided by statistical sampling. People might disable those in pursuit of reproducible builds, but no one would claim that enabling those features makes GCC or LLVM no longer a compiler. So nondeterminism isn't really the distinguishing factor either.
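
The reproducibility point can be sketched with a toy randomized pass (hypothetical, for illustration only): left unseeded, its output varies from run to run; pinning the seed, as reproducible-build setups effectively do, restores determinism without the pass ceasing to be part of a "compiler".

```python
import random

def randomized_layout(functions, seed=None):
    # Toy stand-in for an optimization pass that uses randomness,
    # e.g. a stochastic heuristic for an NP-hard ordering problem.
    rng = random.Random(seed)
    order = list(functions)
    rng.shuffle(order)
    return order

funcs = ["main", "init", "helper", "log"]
# Pinning the seed makes the "build" reproducible.
assert randomized_layout(funcs, seed=42) == randomized_layout(funcs, seed=42)
```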

bdangubic 3 days ago | parent | prev | next [-]

last thing I want is a non-deterministic compiler, do not vibe with this analogy at all…

moffkalast 2 days ago | parent | prev [-]

Finally we've invented a compiler that we can yell at when it gives bullshit errors. I really missed that with gcc.

pseidemann 3 days ago | parent | prev [-]

Isn't there more indirection as long as LLMs use "human" programming languages?

xarope 3 days ago | parent | next [-]

If you think of the training data, e.g. Stack Overflow, GitHub, etc., you have a human asking or describing a problem, and then code as the solution. So I suspect current-gen LLMs still follow this model, which means that for the foreseeable future a human-language prompt will remain the best interface.

Until such time, of course, as LLMs eat their own dogfood, in which case they - as has already happened - create their own language, evolve dramatically, and cue Skynet.

6177c40f 3 days ago | parent | prev | next [-]

More indirection in the sense that there's a layer between you and the code, sure. Less in that the code doesn't really matter as such and you're not having to think hard about the minutiae of programming in order to make something you want. It's very possible that "AI-oriented" programming languages will become the standard eventually (at least for new projects).

recursive 2 days ago | parent [-]

One benefit of conventional code is that it expresses logic in an unambiguous way. Much of "the minutiae" is deciding what happens in edge cases. It's even harder to express that in a human language than in computer languages. For some domains it probably doesn't matter.
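
A concrete instance of that edge-case ambiguity: "round 2.5 to the nearest integer" is underspecified in English, while code forces a choice (Python shown; its built-in `round` uses round-half-to-even).

```python
import math

# English "round 2.5" is ambiguous; each line below pins down one rule.
assert round(2.5) == 2             # Python's round: half to even
assert round(3.5) == 4             # half to even again (nearest even is 4)
assert math.floor(2.5 + 0.5) == 3  # "round half up" must be spelled out
```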

layer8 3 days ago | parent | prev [-]

It’s not clear how the affordances of programming languages really differ between humans and LLMs.