Big Tech Killed the Golden Age of Programming(taylor.gl)
126 points by taylorlunt 3 days ago | 20 comments
zwnow 3 days ago | parent | next [-]

It also caused the "Golden Age of Programming". It's only been a golden age because of high salaries for relatively low effort. So if their needs change, obviously the industry changes. This article has nothing to say really.

lokar 3 days ago | parent | prev | next [-]

I’ve seen this claim that Google and others had some plan to overhire.

From my time there that was not the case. There was the natural demand for more people on existing projects and lots of (often good) ideas for new projects.

The money just poured in. We could never actually hire close to the approved levels. Internal “fights” were over actual people, not headcount, everyone had tons of open headcount.

I think there was just so much money, revenue growth and margin that management (which was dominated by engineers) just did not care. Fund everything and see what happens, why not?

freedomben 3 days ago | parent | prev | next [-]

This is a really terrible article. I suspect the HN comment section will be good, but TFA is not worth reading IMHO (though it is quite short so can be read in a minute or two).

> For years, companies like Google, Facebook/Meta, and Amazon hired too many developers. They knew they were hiring too many developers, but they did it anyway because of corporate greed. They wanted to control the talent pool.

Aside from all the claims with no sources/references whatsoever (claims which are not at all self-evident), blaming "corporate greed" for hiring employees? Isn't it also "corporate greed" to lay people off? Blaming corporate greed for causing high salaries? Let me guess, if they started cutting salaries, that is also corporate greed?

It's not possible to "control the talent pool" when there are so many companies in competition. Yes, they want to hire the best engineers they can find and they will pay handsomely for it. Every company (even our small non-profit) wants to hire the best engineers we can find. It's not "corporate greed" or us wanting to control the talent pool.

wiseowise 3 days ago | parent | prev | next [-]

Big Tech Killed the Golden Age of Programming by *checks notes* creating it in the first place?

nosefrog 3 days ago | parent | prev | next [-]

My first programming job in SF paid $60k/year 10 years ago. I'd like to thank big tech for driving salaries up.

mathattack 3 days ago | parent | prev | next [-]

It's supply and demand.

It could also be a 4X increase in grads for Computer and Information Sciences degrees since the 90s.

Source: https://nces.ed.gov/ipeds/Search?query=computer%20science&qu...

maxdo 3 days ago | parent | prev | next [-]

It’s a bit naïve—almost a textbook neo-western, ego-centric mindset—where everything is attributed to a brilliant personality, and all setbacks are blamed on some mysterious villains.

But in reality, market forces explain it more cleanly—think corporate priorities and shifting strategies, not just “evil managers.”

It’s simpler than it seems. In the past, growing a tech company meant building more products and features, which required more people. That’s how you scaled.

Now, in the AI era, growth often means more GPUs and a smaller, highly skilled team solving business problems.

The pattern of investment has shifted. It’s not about corporate greed—it’s about evolving models of efficiency.

throwaw12 3 days ago | parent | prev | next [-]

Big Tech is not the root cause.

Big Tech and empire builders there followed the classic business rules, additionally highly encouraged by Wall Street.

When it's cheap, grow fast; when it's expensive, shield the bottom line, don't make risky moves, and cut the fat.

People are the same everywhere; you can't just put the blame on Big Tech. Other industries would do the same given the opportunity.

1vuio0pswjnm7 2 days ago | parent | prev | next [-]

The title is "Big Tech killed the Golden Age of Programming"

But the author spends zero time explaining why he thinks internet-based data collection and surveillance is the "Golden Age" of programming.

A golden age, according to one definition, is a period of "peak achievement"

What exactly is the author's concept of achievement?

Perhaps it is financial (not programming)

For example, the period may have been noteworthy for the so-called "tech" industry's ability to pay so many salaries from zero interest loans

Historically, data shows that interest rates fluctuate over time (Can we blame Big Tech for an increase in interest rates?)

With respect to _programming_, some argue that innovation, improvement, and progress actually stalled during this period (and still are stalled) due to Big Tech's anti-competitive practices

From this end user's perspective the software created during this period, what the author calls a period of "fake jobs", is not the pinnacle of achievement in programming

To me, software quality is at an all-time low, and this "Big Tech" period does not stand as a "Golden Age of Programming"

Compared to the software I am using originally created in the 1970s it stinks

But opinions may differ

xlbuttplug2 3 days ago | parent | prev | next [-]

As a mostly fraudulent software developer, I've always considered it a privilege to earn copious amounts of money sitting in my bedroom.

gooch 3 days ago | parent | prev | next [-]

Starts with "It's not the result of regular cycles of employment or the economy." Goes on to describe a classic business cycle.

thisisit 3 days ago | parent | prev | next [-]

I don't think the writer is aware of how a regular business cycle works.

It starts off with some companies in a particular industry generating great margins. Slowly more companies join the industry, and margins remain great. Early employees see a huge jump in salaries. But with time everyone wants to pile in - both companies and people. You start seeing hype that this industry is the next big thing and that you can't survive without being part of it. Once the industry becomes too saturated, companies start exiting, people are laid off, industry services get worse, etc. Nearly every industry goes through these booms and busts, springs and winters. Most destruction leads to a new spring.

Software has seen two springs, though. First, the dotcom boom. During the dotcom boom, lots of CAPEX was spent on undersea cables because everyone was going to use these newfangled "websites". But after the 2000 crash, data prices plummeted, which led to the 2013-2020/21 boom. We have to see where things go from here.

The same thing is happening in AI. It started off with some companies making good margins, and huge salaries are being doled out to early AI experts; given enough time, the market will be chock-full of AI-related products and heading down as well. Then we can expect similar commentaries about how the golden age of AI was killed by greed.

abixb 3 days ago | parent | prev | next [-]

>What happened wasn't just carelessness on the part of Big Tech. It was a power move. They wanted to monopolize talent, burned billions doing it, and then discarded those people like they were nothing. They caused the problem, and now we developers are paying for it.

Gives me "content written by an LLM" vibes -- the short sentence structure, the phrasing, etc., make it appear to me that this is bot-generated, or at least bot-assisted, content.

Dead internet theory in full effect.

davidw 3 days ago | parent | prev | next [-]

More than anything, I miss hacking on cool stuff without quite so much "corporate" involved.

I grew up with the open source culture of the '90s, when people were going to change the world. And they did! Things like Linux are ubiquitous. There were certainly problems with that era: misogyny ran rampant and people could be dicks, but we were still also kind of off in our own little world, without the spotlight that the web and the huge companies built on it brought to things.

I'm no RMS and I enjoy making good money, but I'm fine with 'good' money, don't need crazy money, and miss that happier, more curious era without six-month performance review cycle kinds of shit.

That doesn't mean ignoring business goals; I was very happy when I worked for a company doing fundus cameras in Italy just 10 years ago - that was such a smart group of people, and very oriented towards making the product the best that it could be. But there were a lot of cool things to hack on and not much to get in the way of doing that.

esafak 3 days ago | parent | prev | next [-]

Big tech g̶e̶n̶e̶r̶o̶u̶s̶l̶y̶ lavishly supported programmers for a whole generation. This is something to be happy about.

paxys 3 days ago | parent | prev | next [-]

These big tech companies are collectively hiring thousands of software engineers every week. Smaller companies and startups are hiring tens of thousands more. If you can't get a job, that's on you.

falcor84 3 days ago | parent | prev | next [-]

> They wanted to monopolize talent, burned billions doing it, and then discarded those people like they were nothing. They caused the problem, and now we developers are paying for it.

What's the actual issue here? Is anyone really worse off by having worked at FAANG for a few years and then being given a generous severance package? The alternative explicitly presented by the article is that if they hadn't been hired by FAANG, they would have been working at a smaller company for lower pay, or worse yet, they wouldn't have been able to get a coding job at all.

overstood 3 days ago | parent | prev | next [-]

Times aren’t tight; the premise of this article is flawed. Big tech is insanely profitable and investors are loving it. The cuts are not a hard necessity but a choice made for a different reason.

waldopat 3 days ago | parent | prev | next [-]

I wish I could like this post, but it unfortunately shows a lack of historical framing. So, as an elder millennial, I thought I'd backfill with some data from the 1990s onward. (I'm a bit of a management/tech history nerd as well and studied it in grad school)

TL;DR: The precarity of knowledge workers is not new; it happens every 3-5 years, though it sure feels like it's getting more common.

1991-1993: IBM laid off 120,000 white-collar workers, the largest corporate layoff in history at the time. AT&T and DEC also restructured.

1995-1996: Telecom and PC layoffs as labor shifted abroad and JIT management became dominant.

2000–2002: Perhaps the first example of overhiring at the end of the 1990s (echoing the ZIRP era), followed by massive layoffs in the dot-com bust.

2008–2010: Widespread layoffs across Big Tech and startups with the Great Recession.

2012–2015: Companies like Microsoft, HP, and IBM shed tens of thousands with post-mobile restructuring.

2020: Travel/service tech (Uber, Airbnb, etc.) were hit hard due to COVID shock.

2022–2024: The current wave we’re living through with Post-ZIRP and AI pivots.

If you're looking for books or articles, Gina Neff, Stephen Barley, or Gideon Kunda have some of the oldest work on this. In short, there is no real difference between then and now: instability is hitting workers who genuinely thought they had made it.

rvz 3 days ago | parent | prev [-]

No, it did not.

It killed the golden age of mediocre software developers, which includes the over-inflated role of web development, which can safely be done by LLMs.