antonvs 3 hours ago:
> LLMs can't be strategic because they do not understand the big picture -- that the real work of good software is balancing a hundred different constraints in a way that produces the optimal result for the humans who use it.

There's good reason to think they could understand the big picture just fine, even today, were they not severely constrained by what we choose, or have time, to tell them. They can already give a far more comprehensive survey of suitable options for solving a given problem than most humans can. If they had more direct access to the information we have, instead of the dribs and drabs we currently grudgingly dole out, they would be much more capable.