adampunk 5 hours ago
It’s worth pointing out that as of February 2026 the costs here are still pretty speculative. We’ve got some small-sample studies on developers, and some anecdotal reports of certain skills falling away. But frankly, if these anecdotes and limited data were attached to some claim about Rust, for example, no one would give them any credence whatsoever. What we’re working with, unfortunately, are vibes.

It really seems as though AI coding will have this effect on people. Morally, it seems like it ought to have this effect on people. We should not be allowed to be at ease without paying some sort of cost, and if we can luridly suggest that you don’t pay with money, all the better. This allows the piece to perform its function even when it doesn’t fully commit to it. A work in the genre can say all sorts of nuanced things about agentic coding while still keeping the core premise that those who resist, or position themselves strategically, will be the winners.

That’s possible! It’s entirely possible that we will see some skill atrophy caused by AI usage AND that it materially impacts outcomes that matter. We simply do not know whether that is the case. I suspect we lean toward assuming it anyway because these predictions cost nothing to make.

If we look at the starting point for most people on this stuff, it’s basically last fall. The author points this out, but the necessary conclusion to draw from that is that we don’t have enough information to tell what the cost will be. It may be like moving to programming languages from assembly, or to assembly from bespoke instructions: fundamentally very little was lost in those transitions, despite a lot of carping at the time. Or it could be like the introduction of the tablet in American schools, where what we lose is nearly everything. We really do not know.
It might be prudent to be cautious in this situation, but we ought to respect the fact that this caution might itself be born of an old paradigm.