mgfist | 36 minutes ago

Because that requires adoption. Devs on Hacker News are already the most up-to-date folks in the industry, and even here adoption of LLMs is incredibly slow. And a lot of the adoption that does happen is still with older tech like ChatGPT or Cursor.

HPMOR | an hour ago

I think this is still an open question, and a very interesting one. Ilya discussed it on the Dwarkesh podcast. But the capabilities of LLMs are clearly exponential, perhaps super-exponential. We went from something that could barely string together coherent text in 2022 to general models helping people like Terence Tao and Scott Aaronson write new research papers. LLMs have also beaten the IMO and the ICPC. We have entered the John Henry era for intellectual tasks...

llmslave2 | 37 minutes ago

> But the capabilities of LLMs are clearly exponential, perhaps super-exponential

By what metric?

viraptor | an hour ago

Writing the code itself was never the main bottleneck. Designing the bigger solution, figuring out tradeoffs, talking to affected teams, etc. takes as much time as it used to. But still, there's definitely a significant improvement in the code-production part in many areas.

aoeusnth1 | an hour ago

It has! CLs per engineer increased by 10% this year. LLMs from late 2024 were nearly worthless as coding agents, so given that they have quadrupled in capability since then (exponential growth, btw), it's not surprising to see a modestly positive impact on SWE work. Also, I notice you're not explaining yourself :)

llmslave2 | 34 minutes ago

Hey, I'm not the OG commenter, why do I have to explain myself! :)

When Fernando Alonso (best rookie, btw) goes from 0-60 in 2.4 seconds in his Aston Martin, is it reasonable to assume he will be nearing the speed of light in 20 seconds?
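
A quick back-of-the-envelope sketch of that analogy (purely illustrative; the 0-60 time and the 20-second horizon come from the comment above, and constant acceleration is the generous assumption): even extrapolating the launch acceleration in a straight line, 20 seconds gets you to roughly 220 m/s, about seven orders of magnitude short of light speed.

    # Toy extrapolation check for the Alonso analogy (illustrative only):
    # hold the 0-60 launch acceleration constant and see where 20 s of it lands you.

    MPH_TO_MS = 0.44704            # metres per second in one mile per hour
    LIGHT_SPEED = 299_792_458.0    # speed of light, m/s

    launch_speed = 60 * MPH_TO_MS          # ~26.8 m/s at the end of the 0-60 run
    accel = launch_speed / 2.4             # ~11.2 m/s^2 average launch acceleration

    speed_after_20s = accel * 20.0         # naive straight-line extrapolation
    print(f"average launch acceleration: {accel:.1f} m/s^2")
    print(f"speed after 20 s at that rate: {speed_after_20s:.0f} m/s")
    print(f"fraction of light speed: {speed_after_20s / LIGHT_SPEED:.1e}")  # ~7.5e-07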