khaledh 6 hours ago

In the 1950s, computers were starting to go mainstream and everyone panicked that they would lose their jobs to "automation". Some jobs were certainly lost, but many more were created by the demand that computers generated.

The same thing happened in the late 50s / early 60s when high-level programming languages and compilers started to appear. Almost all software at the time was hand-written assembly. It took compilers about a decade to match (and sometimes exceed) the quality of hand-written assembly. Programmers adapted and started thinking at a higher level of abstraction.

Another example is virtual memory. Up until the late 1960s, most software used manual physical-memory management techniques (mainly overlays) to decide which parts of the program should reside in memory at any given time. Everyone was skeptical and assumed virtual memory would be less efficient than manually designed overlays. A lot of research at the time went into proving that virtual memory could match or even beat the performance of hand-rolled approaches.

The point is: AI may well be disruptive to the way we develop software, but we're still in a transition phase where trust in AI output is shaky. It's impressive, but it hasn't yet established itself as an abstraction we can build on without scrutinizing what it produces. That will take time, and humans will always have more work to do, no matter how technology advances.