▲ specproc | 12 hours ago
I wouldn't underestimate the anxiety it's causing. Most of my social circle are non-technical. A lot of people have had a difficult time with work recently, for various reasons. The global economic climate feels very precarious, politics is ugly, and people feel powerless and afraid. AI tends to come up in the "state of the world" conversation. It's destroying friends' decade-old businesses in translation, copywriting and editing, and it has completely upturned a lot of other jobs; I know a lot of teachers and academics, for example. Corporate enthusiasm for AI is seen for what it actually is: a chance to cut head count.

I'm into AI, I get value out of it, but decision makers need to read the room a bit better. The vibe in 2025 is angry and scared.
▲ surgical_fire | 10 hours ago

> Corporate enthusiasm for AI is seen for what it actually is, a chance to cut head count

I mean, that is the reason the usual suspects push it so aggressively. The underclasses must always be pushed down. It's mostly bullshit; in most areas, LLMs cannot reliably replace humans. But they embrace it anyway, because the chance that it might undermine labor is very seductive.
▲ acdha | 8 hours ago

Even if it doesn’t replace humans, simply having the prospect looming allows them to lower pay and deter people from asking for better working conditions.
▲ spacemadness | 5 hours ago

Instead of taking the time to understand how it might actually be effective, a lot of leadership decided immediately that people could take on double their workload or more right out of the gate, as long as they use AI. I’ve seen people driven to tears from the stress of overwork in this environment. Everyone in my org was either burnt out or soon to be, and that overwork comes from managers demanding more and more without any data to back up how much more is reasonable.

Then you get these types on HN calling people luddites for having strong opinions and anxieties, as if it’s only ever about the technology itself and not the effect it has on actual people in a cutthroat capitalist system. That’s exactly the sort of thing that brought the term “tech bro” into the limelight.
|
|
|
▲ acdha | 8 hours ago
I wouldn’t be so dismissive: “force multiplier” means job loss unless there’s a large amount of work which isn’t currently being done. As you live in a society, it really is important to think about what happens if we get the mass layoffs almost all of the executive class are promising. There are some new jobs around the tech itself, but that doesn’t help displaced workers unless they can land one of those new jobs – and even the most excited proponents should ask themselves who is buying their product in a world with, say, 50% fewer white-collar jobs.

It’s also worth noting that while our modern use of “Luddite” simply means “anti-technology”, there was a lot more going on. The Napoleonic wars were savaging the economy, and Luddism wasn’t just a reaction to the emergence of a new technology but even more to the successful class warfare being waged by an upper class that held the line against the weavers’ attempts to negotiate better terms and was willing to deploy the army against the working class. High inflation and unemployment created a lot of discontent, and the machines bore the brunt of it because they were a way to strike back at the industrialists while being both more exposed and a more acceptable target for anyone who wasn’t at the point of being willing to harm or kill a person.

Perhaps most relevant to HN is that the weavers had previously not joined together to bargain collectively. I can’t help but think that almost everyone who said “I’m too smart to need a union” during the longest run of high-paying jobs for nerds in history is going to regret that decision, especially after seeing how little loyalty the C-suite has shown after years of pretending otherwise.
▲ ryandrake | 4 hours ago

> I wouldn’t be so dismissive: “force multiplier” means job loss unless there’s a large amount of work which isn’t currently being done.

I think there is a massive amount of work that's currently not being done, industry-wide. Everywhere I've worked has had something like 4-10x more work to do than staff to do it. The feature and bug backlogs grow endlessly because there is no capacity to keep up. Companies should be able to adopt a force multiplier without losing staff: it could bring that factor down to 2-5x. The fact that layoffs are happening industry-wide anyway suggests that leadership might not even know what their workers could be doing.
▲ acdha | an hour ago

In software, yes, there is a lot of postponed work. I was thinking of other areas, like how companies want to replace customer service, claims processing, etc., where efficiency improvements translate pretty directly into job losses unless the business suddenly grows significantly.
|
|
|
▲ darkwater | 12 hours ago

> Yeah. This is a bad move. AI is a human force multiplier (exponentializer?).

If it's a multiplier, you need to either increase the requested work to keep the same humans or reduce the humans needed if you keep the same workload. It's not straightforward which way each business will follow.
▲ csa | 9 hours ago

> If it's a multiplier you need to either increase the requested work to keep the same humans or reduce the humans needed if you keep the same workload. It's not straightforward which way each business will follow.

I guess what you’re saying is technically true while being somewhat misleading. “Increase the requested work” is one way of saying “reduce the amount of scutwork that needs to be done”. Personally, I’m OK having less scutwork. I’m also OK letting AI do optional scutwork that falls into the “nice to have” category (e.g., creating archival information).

On a personal level, I have used AI to automate a lot of required scutwork, freeing up my time for higher value-added tasks. In terms of time saved, the biggest areas have been preliminary research, summaries, and writing drafts.

Additionally, in one specific use case that I can talk about, I helped a medical billing office prioritize and organize their work based on the estimated hourly value of items processed as well as their difficulty (difficult items were slotted for certain times of the day). This work had been eyeballed in the past, and could be done with a decent degree of accuracy given quite a bit of time, but AI did it with higher accuracy and almost no dedicated human time. The whole office appreciated the outcomes.

There are many wins like this available with AI, and I think many folks just haven’t found them yet.
|
|
▲ crinkly | 13 hours ago
Anything that is realistically a force multiplier is a person divider. At that point I would expect people to resist it. That is assuming that it really is a force multiplier, which is not totally evident at this point.
▲ csa | 9 hours ago

> That is assuming that it is really a force multiplier which is not totally evident at this point.

At this point, I really think this reflects a lack of imagination on the part of the people who think this way. There are two easy wins for almost anyone:

1. Optional tasks that add value but never reach the top of the priority list.

2. Writing routine communication and documentation.

> Anything that is realistically a force multiplier is a person divider. At that point I would expect people to resist it.

The CEOs who are using AI as an excuse to reduce head count are not helping this narrative. AI will not deliver the large staff cuts they are after; it’s just an unrelated excuse to walk back the over-hiring of the past (especially during Covid). That said, I think that framing AI as a “person divider” is baseless fear-mongering for most job categories.