1attice a day ago

I assume this is about AI, but I'd like to widen the lens a bit beyond that.

Since the eighties, software (as a problem/solution space) has expanded to fit the hardware available. At first we scaled with the density of transistors; then, in the nineties, we scaled with the unfolding multitude of Internet-enabled use cases. Then, in the late aughts, we scaled again, this time with the multitude of mobile and embedded use cases.

Think of it as a sort of hot-air balloon -- each of these initial spurts is kind of like the guy in the balloon pulling the rope to heat the air; the balloon goes up. And each time we got to pull that rope, we got to go higher, and explore more and more of the problem/solution space, transforming the world as we went.

The thing is, hot-air balloons can only rise so high -- they have a context window that covers the lower and middle parts of Earth's atmosphere. But you can't get a hot-air balloon into orbit -- the context window gives out; you'd need to go back to earth and start from scratch (maybe build a rocket).

What I find striking about AI isn't that it can replace all or most coders; it's that (to continue the metaphor) it makes the existing air in the balloon more buoyant -- there is less need for the rope-pulling guy to pull that rope (so if your job is rope-pulling guy, watch out). Yet -- and this is the crucial part -- AI is being _sold_ as if it's _more atmosphere_. I.e., someplace for both money and talent to _go_; a reason to keep pulling that rope.

For a while, it looked as if it was both -- a way to pull on the rope less frequently, and _also_ a new problem/solution space to go with the ol' balloon basket. But I'd reckon the _excitement_ about AI largely had to do with the second interpretation.

Now the picture is less certain. For some activities, AI still seems genuinely revelatory/apocalyptic, depending on which side of the manager/labour dyad you fall on. Yet recent studies (frequently alluded to here on HN already, scroll around) seem to show limited bottom-line benefit for a lot of use cases. This might mean there is UX work still to be done, or it might mean that we're bumping up against the top of the balloon's useful range, skidding along the ceiling of the problem/solution space.

So, ironically, if AI turns out to be very useful -- in the same way that word processing, email, and maps all proved useful -- programming sticks around as a lucrative profession, modulo some changes in how we market and think about ourselves as engineers. We will use AI to build the new AI things that people want.

But if, for whatever reason, AI turns out to be less of a big deal than it initially seemed, then we rope-pullers are in a bind: there is less need to pull that rope, because there is nowhere left to go (and worse, the balloon is simply more buoyant now, and needs fewer rope-pullers).

So if AI is a big deal, the party continues; if AI is _not_ a big deal, it's time to get other skills, as a stagnant market leaves investment capital nowhere to go and our skill set becomes commoditized. (Time to get a union.)