ben_w 3 days ago
That goes too far in the opposite direction. Humans come with a broad range of skills and performance, and LLMs fall inside that range. The fact that LLMs are not human, and that the best humans beat them, is about as economically relevant as the fact that a ride-on lawnmower isn't human and an athlete can (typically) outrun it — i.e. it resolves to what you're actually using them for.
zelphirkalt 3 days ago | parent
But it is not merely the best humans. Any good developer can write better code, because LLMs tend towards the mean of their training data — which is mostly mediocre code scraped from GitHub. They may excel at solving very narrow problems with decent results, as in that recent programming competition, but those are narrowly defined problems. An LLM may solve such a problem decently in limited time, but that is roughly its ceiling, whereas a human, given more time, can reach a much higher level. It becomes a question of whether we want mediocre output that is hard to extend and maintain — relying on the very thing that produced that mediocre code to maintain and extend it — or high-quality work. For the latter, one would want to hire qualified people. Too bad that hiring is broken at many companies, and they fail to recognize qualifications even when they're right in front of them.