| ▲ | bananaflag 4 hours ago |
| Yeah but this time it's for real. All the other attempts failed because they were just mindless conversions of formal languages to formal languages: basically glorified compilers. Either the formal language wasn't capable enough to express all situations, or it was capable and thus as complex as the thing it was designed to replace. AI is different. You tell it what you want in natural language, which can be ambiguous and not cover all the bases, and people are familiar with natural language. It can fill in the missing details and disambiguate the rest. This has been known to be possible for decades, since (simplifying a bit) a non-technical manager can tell an engineer, in natural, ambiguous language, what to do, and the engineer will do it. Now the AI takes the place of the engineer. Also, before AI I personally never believed that programming would disappear, so the argument that "this has been hyped before" doesn't touch my soul. I have no idea why this is so hard to understand. I'd like people to reply to me in addition to downvoting. |
|
| ▲ | danhau 3 hours ago | parent | next [-] |
| Programmers have enjoyed an occupation with solid stability and growing opportunities. AI challenging this virtually overnight is a tough pill to swallow. Naturally, many subscribe to the hope that it will fail. How far AI will succeed in replacing programmers remains to be seen. Personally I think many jobs will disappear, especially in the largest domains (web). But I think this will be only a fraction, not a majority. For now, AI is simply most useful when paired with a programmer. |
| |
| ▲ | aleph_minus_one an hour ago | parent | next [-] | | > Programmers have enjoyed an occupation with solid stability and growing opportunities. This is not the case: - Before the 90s, programming was rather a job for people who were insanely passionate about technology, and working as a programmer was not that well regarded (so no "growing opportunities"). - After the first dotcom bubble burst, a lot of programmers were unemployed. - Every older programmer can tell you how quickly their skills can become, and have become, irrelevant. Over the last decade, the stability and opportunities for programmers have been more like a series of boom-bust cycles. | |
| ▲ | cafebabbe 2 hours ago | parent | prev [-] | | AI is useful when paired with an experienced programmer. Experienced through old-school (pre-LLM) practice. I don't clearly see a good endgame for this. | | |
| ▲ | citrin_ru an hour ago | parent | next [-] | | The endgame is to produce an AI that will not need any supervision by the time the current generation of experienced developers retires, or even sooner. I don’t know if it will happen, but many are betting on it, and models are still improving; no flattening is visible yet. | | |
| ▲ | ajshahH 7 minutes ago | parent [-] | | This implies programming is done and there will be no other advancements. And flattening is being seen, no? Recent advancements are mostly from RL’ing, which has limitations (and tradeoffs) too. Are there more tricks after that? |
| |
| ▲ | duggan 2 hours ago | parent | prev [-] | | Motivated novices will just learn differently, and produce different kinds of systems for different audiences with different expectations. Some will dig into obscurities that LLMs don't or can't touch, others will orchestrate the tools, Gastown-style, into some as-yet-unknown form. People will vibe themselves into a corner and either start learning or flame out. |
|
|
|
| ▲ | t_mahmood an hour ago | parent | prev | next [-] |
| A manager is not going to handle all the nitty-gritty details that an engineer knows. Fine, say they can ask an LLM to make a web portal. Do they know about SQL injection? XSS? Maybe they know a little about security and ask the LLM to make a secure site with all the protection needed. But how does the manager know it works at all? If you find out there's an issue with a critical part of your software only after your users' data has been stolen, how bad is the fallout going to be? How good a tool is also depends on who's using it. Managers are obviously not engineers, unless they were engineers before becoming managers, but you are saying engineers are not needed. So where are the engineer-managers going to come from? I'm sure we're not growing them on engineering trees |
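To make the SQL-injection worry above concrete, here is a minimal sketch using Python's stdlib sqlite3 module (the table and data are made up for illustration). The first query splices user input directly into the SQL string, so a crafted input leaks every row; the second binds the input as a parameter, so it is treated as a literal value rather than SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's1'), ('bob', 's2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: input is spliced into the SQL string, so the injected
# OR clause makes the WHERE condition match every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % attacker_input
).fetchall()

# Safe: the driver binds the input as a parameter, so the whole
# string is compared as a literal name and matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(vulnerable)  # both rows leak
print(safe)        # no rows match
```

This is exactly the kind of difference that is invisible in a demo: both versions render the same page for honest input, and only one of them survives contact with a hostile user.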
| |
| ▲ | skydhash an hour ago | parent [-] | | It's like saying "I want a bridge" and then expecting steel beams and cables (or planks and ropes) to appear, and that's all you need. The user's needs are usually clear enough (they need a way to cross that body of water or that chasm), but the how is the real catch. In the real world, the materials are visible, so people have a partial understanding of how it gets done. But most of the software world is invisible and has no material constraints other than the hardware (you can't use RAM that isn't there). If the hardware is like a blank canvas, a standard web framework is like a paint-by-numbers book (but one with lines drawn in pencil so you can erase them easily). Asking the user to code with an LLM is like asking a blind person to paint the Mona Lisa with a brick. |
|
|
| ▲ | empath75 an hour ago | parent | prev | next [-] |
| I spent the last two weeks at work building a whole system to deploy automated Claude Code agents in response to events, and even before I finished it was already doing useful work. Now it is automatically handling Jira tickets and making PRs. |
|
| ▲ | quotemstr 2 hours ago | parent | prev [-] |
| The thing about talking to computers is less the formality and more the specificity. People don't know what they want. To use an LLM effectively, you need to think about what you want with enough clarity to ask for it and check that you're getting it. That LLMs accept your wishes in the form of natural language instead of something with a LALR(1) grammar doesn't magically obviate the need for specificity and clarity in communication. |
| |
| ▲ | bananaflag 2 hours ago | parent [-] | | Agreed that one needs clarity, but how does that differ from my example with the manager and the engineer? The manager also (ideally) learns over time that when they are clearer, the engineer does better work. | | |
| ▲ | elasticeel 39 minutes ago | parent | next [-] | | Do they though? Or do they learn that having a good engineer means they can assign ambiguous tasks, and the software developer can reason through good decision-making and follow up with clarifying questions? LLMs need to get better at asking clarifying questions and at flagging that the initial solution might not work. Even when they get better at that, the article's point is that managers not capable of thinking through the answers well enough will fall short, and this is the space that developers live in. | |
| ▲ | skydhash an hour ago | parent | prev [-] | | TLDR: Clarity in software engineering means detailing all the constraints, which no user (apart from lawyers and engineers) usually does, because the real world has constraints that software does not. The hardware offers so few guarantees that the whole job of the OS is to provide them. All the layers are formal, but usefulness doesn't come from that. Usefulness comes from a consistent model that embodies a domain. So you have the hardware, which has capabilities but no model. Then you add the OS kernel, which imposes a model on the hardware; then the system libraries, which further restrict it to certain domains. Then you have the general libraries, which are more useful because they present another perspective. And then you have the application, which uses this last model according to a certain need. A good example: you go from the sound card to the sound subsystem, then the ALSA libraries, to PipeWire, to an audio player or a media framework like the one in the browser. Dozens of engineers have contributed to this particular tower, and most developers only deal with the last layers, but the lesson is that the perspective of a user differs from the building blocks we have in hand. Software engineering is about reconciling the two. So people may know how things should look or behave on their end, but they have no idea what the building blocks are on the other. It's all abstract. The only things real are the hardware and the energy powering it. Everything else has to be specified with code. And in the world that forms the middle layer, there are many rules to follow to make something good, but few laws that prevent something bad. It's not like physical engineering, where there are things you simply cannot do. Just as on a canvas you can draw anything as long as it stays inside the boundary of the canvas, in software you can do anything as long as it stays inside the boundary of the hardware. The OS on personal computers adds a few more restrictions, but not many. It's basically Fantasia in there. |
|
|