MontyCarloHall 8 hours ago

It's not so much about replacing developers, but rather increasing the level of abstraction developers can work at, to allow them to work on more complex problems.

The first electronic computers were programmed by manually re-wiring their circuits. Going from that to being able to encode machine instructions on punchcards did not replace developers. Nor did going from raw machine instructions to assembly code. Nor did going from hand-written assembly to compiled low-level languages like C/FORTRAN. Nor did going from low-level languages to higher-level languages like Java, C++, or Python. Nor did relying on libraries/frameworks for implementing functionality that previously had to be written from scratch each time. Each of these steps freed developers from having to worry about lower-level problems and instead focus on higher-level problems. Mel's intellect is freed from having to optimize the position of the memory drum [0] to allow him to focus on optimizing the higher-level logic/algorithms of the problem he's solving. As a result, software has become both more complex and much more capable, and thus much more common.

(The thing that distinguishes gen-AI from all the previous examples of increasing abstraction is that those examples are deterministic and often formally verifiable mappings from higher abstraction -> lower abstraction. Gen-AI is neither.)
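To make that concrete, here's a minimal sketch in plain standard-library Python (illustrative only): compile the same source twice and you get byte-identical output you can hash and verify; a sampling LLM at nonzero temperature offers no such guarantee.

    import hashlib

    src = "def f(x):\n    return x + 1\n"

    # Compilation is a deterministic mapping: same source in,
    # byte-identical bytecode out, on every run.
    code1 = compile(src, "<demo>", "exec")
    code2 = compile(src, "<demo>", "exec")
    assert code1.co_code == code2.co_code
    print(hashlib.sha256(code1.co_code).hexdigest())

    # A sampling LLM at temperature > 0 has no such property: the same
    # prompt can map to many different outputs, with no formal spec to
    # verify any one of them against.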

[0] http://catb.org/jargon/html/story-of-mel.html

falloutx 3 hours ago

> It's not so much about replacing developers, but rather increasing the level of abstraction developers can work at, to allow them to work on more complex problems.

That's not the goal Anthropic's CEO has, nor any other CEO's for that matter.

SkiFire13 8 hours ago

> It's not so much about replacing developers, but rather increasing the level of abstraction developers can work at, to allow them to work on more complex problems.

People do and will talk about replacing developers though.

MontyCarloHall 8 hours ago

Were many of the aforementioned advancements marketed as "replacing developers"? Absolutely. Did that end up happening? Quite the opposite; each higher-level abstraction only caused the market for software and demand for developers to grow.

That's not to say developers haven't been displaced by abstraction; I suspect many of the people responsible for re-wiring the ENIAC were completely out of a job when punchcards hit the scene. But their absence was filled by a greater number of higher-level punchcard-wielding developers.

Palomides 7 hours ago

The infinite-fountain-of-software machine seems more likely to replace developers than previous innovations, and the people pushing the button will not be, in any current sense of the word, programming.

fn-mote 6 hours ago

You absolutely need to be trying to accomplish these things personally to understand what is/will be easy and where the barriers are.

Recognizing the barriers & modes of failure (which will be a moving target) lets you respond competently when you are called. Raise your hourly rate as needed.

ori_b 7 hours ago

The goal of AI companies is to replace all intellectual labor. You can argue that they're going to fail, but it's very clear what the actual goal is.

billy99k 4 hours ago

One of my clients is an AI startup in the security industry. Their business model is to use AI agents to perform the initial assessment and then cut the security contractors' hours by 50% to complete the job.

I don't think AI will completely replace these jobs, but it could reduce job numbers by a very large amount.

smj-edison 7 hours ago

I think one thing that's been missing from these discussions, though, is that each level of abstraction needs to be introspectable. LLMs get compared to compilers a lot, so I'd like to ask: what is the equivalent of dumping the tokens, AST, SSA, IR, optimization passes, and assembly?

That's where I find the analogy on thin ice, because somebody has to understand the layers and their transformations.
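For compilers, that introspection is available off the shelf. As a stand-in (using Python's own toolchain, since it ships the relevant modules in the standard library), every layer can be dumped on demand:

    import ast
    import dis
    import io
    import tokenize

    src = "def double(x):\n    return x * 2\n"

    # Layer 1: the raw token stream.
    for tok in tokenize.generate_tokens(io.StringIO(src).readline):
        print(tok)

    # Layer 2: the abstract syntax tree.
    print(ast.dump(ast.parse(src), indent=2))

    # Layer 3: the compiled bytecode, Python's rough analogue of assembly.
    dis.dis(compile(src, "<example>", "exec"))

What's the equivalent dump for the transformation from a prompt to the code that comes back?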

fn-mote 6 hours ago

“Needs to be” is a strong claim. The skill of debugging complex problems by stepping through disassembly to find a compiler bug is very specialized. Few can do it. Most applications don’t need that “introspection”. They need the “encapsulation” and faith that the lower layers work well 99.9+% of the time, and they need to know who to call when it fails.

I’m not saying generative AI meets this standard, but it’s different from what you’re saying.

smj-edison 6 hours ago

Sorry, I should clarify: it needs to be introspectable by somebody. Not every programmer needs to be able to introspect the lower layers, but that capability needs to exist.

Now I guess you can read the code an LLM generates, so maybe that layer does exist. But that's why I don't like the idea of making a programming language for LLMs, by LLMs, that's inscrutable to humans. A lot of those intermediate layers in compilers are designed for humans, with only assembly generation being made for the CPU.

tosapple 3 hours ago

This is a good point but may be moot. Our consumer-facing LLMs speak C, Python, and JavaScript.

Decompilers work in the machine-code direction for human consumption, and they can be improved by LLMs.

Militarily, you will want systems capable of both machine code and JS.

Machine-code capabilities cover both memory leaks and firmware dumps, and negate the requirement of "source" comprehension.

I wanted to +1 you but I don't think I have the karma required.

tosapple 3 hours ago

Also, smuggling a single binary out of a set of systems is likely far easier than targeting a source code repository or devbox directly.

AndrewKemendo 8 hours ago

I think the thing that's so weird to me is this idea that we all have to somehow internalize the concept of transistor switching as the foundational, unchangeable root of computing, and that therefore anything too far abstracted from that is somehow not real computing, or some mess like that.

Again, this completely ignores that programming vacuum-tube computers involved an entirely different type of abstraction than you use with MOSFETs, for example.

I’m finding myself in the position where I can safely ignore any conversation about engineering with anybody who thinks there is a “right” way to do it, or that there’s any kind of ceremony or thinking pattern that needs to stay stable.

Those are all artifacts of humans desiring very little variance, things they’ve even encoded because it takes real energy to reconfigure your own internal state model to a new paradigm.