| ▲ | fdefitte 6 hours ago |
| This is an underrated take. If you make someone 3x faster at producing a report nobody reads, you've improved nothing. The real gains from AI show up when it changes what work gets done, not just how fast existing work happens. Most companies are still in the "do the same stuff but with AI" phase. |
|
| ▲ | Stromgren 3 hours ago | parent | next [-] |
| And if you make someone 3x faster at producing a report that 100 people have to read, but it now takes 10% longer to read and understand, you’ve lost overall value. |
| |
| ▲ | anon-3988 2 hours ago | parent [-] | | You are forgetting that they are now going to use AI to summarize it back. | | |
| ▲ | kombookcha an hour ago | parent | next [-] | | This is one of my major concerns about people trying to use these tools for 'efficiency'. The only plausible value in somebody writing a huge report and somebody else reading it is information transfer. LLMs are notoriously bad at this. The noise to signal ratio is unacceptably high, and you will be worse off reading the summary than if you skimmed the first and last pages. In fact, you will be worse off than if you did nothing at all. Using AI to output noise and learn nothing at breakneck speeds is worse than simply looking out the window, because you now have a false sense of security about your understanding of the material. Relatedly, I think people get the sense that 'getting better at prompting' is purely a one-way issue of training the robot to give better outputs. But you are also training yourself to only ask the sorts of questions that it can answer well. Those questions that it will no longer occur to you to ask (not just of the robot, but of yourself) might be the most pertinent ones! | | |
| ▲ | notahacker 15 minutes ago | parent | next [-] | | Yep. The other way it can have no net impact is if it saves thousands of hours of report drafting and reading but misses the one salient fact buried in the observations that could actually save the company money, whilst completely nailing the fluff. | |
| ▲ | birdsongs an hour ago | parent | prev | next [-] | | > LLMs are notoriously bad at this. The noise to signal ratio is unacceptably high. I could go either way on the future of this, but if you take the argument that we're still early days, this may not hold. They're notoriously bad at this so far. We could still be in the PC DOS 3.X era in this timeline. Wait until we hit the Windows 3.1 or 95 equivalent. Personally, I have seen shocking improvements in the past 3 months with the latest models. | | |
| ▲ | kombookcha 15 minutes ago | parent | next [-] | | Personally I strongly doubt it. Since the nature of LLMs gives them no grasp of semantic content or context, I believe the technology is inherently unsuited for this task. As far as I can tell, it's a limitation of the technology itself, not of the amount of power behind it. Either way, being able to generate or compress loads of text very quickly with no understanding of the contents simply is not the bottleneck of information transfer between human beings. | |
| ▲ | mcny an hour ago | parent | prev [-] | | I would like to see the day when the context size is in gigabytes or tens of billions of tokens, not RAG or whatever, but actual context. |
| |
| ▲ | kykeonaut an hour ago | parent | prev [-] | | > Those questions that it will no longer occur to you to ask (not just of the robot, but of yourself) might be the most pertinent ones! That is true, but then again the same was true of Google. You can see why some people want to go back to the "read the book" era, where you didn't have Google to query and had to come up with the real questions yourself. |
| |
| ▲ | prmoustache an hour ago | parent | prev | next [-] | | This reminds me of that "telephone" kids game. https://en.wikipedia.org/wiki/Telephone_game | |
| ▲ | SpaceNoodled an hour ago | parent | prev [-] | | So what we now have is a very expensive and energy-intensive method for inflating data in a lossy manner. Incredible. | | |
| ▲ | amoss 9 minutes ago | parent [-] | | Remarkably it has only cost a few trillion dollars to get here! |
|
|
|
|
| ▲ | amelius 25 minutes ago | parent | prev | next [-] |
| > The real gains from AI show up when it changes what work gets done, not just how fast existing work happens. Sadly AI is only capable of doing work that has already been done, thousands of times. |
|
| ▲ | injidup 4 hours ago | parent | prev | next [-] |
| Maybe the take is that those reports that people took a day to write were read by nobody in the first place, and now those reports are being written faster and more of them are being produced, but still nobody reads them. Thus productivity doesn't change. The solution is to get rid of all the people who write and process reports and empower the people who actually produce stuff to do it better. |
| |
| ▲ | patrickk 42 minutes ago | parent | next [-] | | > The solution is to get rid of all the people who write and process reports and empower the people who actually produce stuff to do it better. That’s the solution if you’re the business owner. That’s definitely not the solution if you’re a manager in charge of this useless activity; in fact, you should increase the amount of reports being written as much as humanly possible. The more underlings under you, the more power and prestige. This is the principal-agent problem writ large. As the comment above mentioned, see also Graeber’s Bullshit Jobs essay and book. | |
| ▲ | beAbU 2 hours ago | parent | prev | next [-] | | The managerial class are like cats and closed doors. Of course they don't read the reports; who has time to read them? But don't even think about not sending the report: they like to have the option of reading it if they choose to do so. A closed door removes agency from a cat; an absent report removes agency from a manager. | |
| ▲ | laserlight an hour ago | parent | prev [-] | | > Thus productivity doesn't change. Indeed, productivity has decreased, because now there’s more output that is waste and you are paying to generate that excess waste. |
|
|
| ▲ | seanhunter an hour ago | parent | prev | next [-] |
| What happens if (and I suspect this to be increasingly the case now) you make someone 3x faster at producing a report that nobody reads and those people now use LLMs to not read the report whereas they were not reading it in person before? Then everyone saves time, which they can spend producing more things which other people will not read and/or not reading the things that other people produce (using LLMs)? Productivity through the roof. |
| |
| ▲ | carlosjobim an hour ago | parent | next [-] | | Now you know why GDP is higher than ever and people are poorer than ever. | |
| ▲ | nkrisc 37 minutes ago | parent | prev [-] | | Mmm I can’t wait to get home and grill up some Productivity for dinner. We’ll have so much Productivity and no jobs. Hopefully our billionaire overlords deign to feed us. |
|
|
| ▲ | jacquesm 35 minutes ago | parent | prev | next [-] |
| And the fact that you can make it 3x faster substantially increases the chances that nobody will read it in the first place. |
|
| ▲ | Lerc an hour ago | parent | prev | next [-] |
| What a load of nonsense, they won't be producing a report in a third of the time only to have no-one read it. They'll spend the same amount of time and produce a report three times the length, which will then go unread. |
|
| ▲ | wiseowise 5 hours ago | parent | prev [-] |
| Not a phase; I’d argue that 90% of modern jobs are bullshit to keep the cattle occupied and the economy rolling. |
| |
| ▲ | nkrisc 35 minutes ago | parent | next [-] | | You know, that would almost be fine if everyone could afford a home and food and some pleasures. | |
| ▲ | Retric 4 hours ago | parent | prev | next [-] | | Jobs you don’t notice or understand often look pointless. HR on the surface seems unimportant, but you’d notice if the company stopped having health insurance or sending your taxes to the IRS, etc. In the end, when jobs are done right they seem to disappear. We notice crappy software or a poorly done HVAC system, not clean carpets. | |
| ▲ | nkrisc 33 minutes ago | parent | next [-] | | This just highlights the absurdity of having your employer responsible for your health insurance and managing your taxes for you. These should be handled by the government, equally for all. | |
| ▲ | jdasdf 2 hours ago | parent | prev [-] | | > HR on the surface seems unimportant, but you’d notice if the company stopped having health insurance or sending your taxes to the IRS, etc. Interesting how the very example you give for "oh this job isn't really bullshit" ultimately ends up being useless for the business itself, and exists only as a result of regulation. No, health insurance being provided by employers and tax withholding aren't useful things for anyone, except for the state, which now offloads its costs onto private businesses. |
| |
| ▲ | yoyohello13 5 hours ago | parent | prev | next [-] | | Your claim and the claims that all white collar jobs are going to disappear in 12-18 months cannot both be true. I guess we will see. | | |
| ▲ | onion2k 4 hours ago | parent | next [-] | | It's possible to automate the pointless stuff without realising it's pointless. | | | |
| ▲ | beeflet 4 hours ago | parent | prev | next [-] | | I think they can both be true. Perhaps the innovation of AI is not that it automates important work, but because it forces people to question if the work has already been automated or is even necessary. | |
| ▲ | zzrrt 4 hours ago | parent | prev [-] | | Well, if a lot of it is bullshit that can also be done more efficiently with AI, then 99% of white collar roles could be eliminated by the 1% using AI, and essentially both claims would be very close to true. |
| |
| ▲ | palmotea 3 hours ago | parent | prev [-] | | > Not a phase, I’d argue that 90% of modern jobs are bullshit to keep cattle occupied and economy rolling. Cattle? You actually think that about other people? | | |
| ▲ | wao0uuno 2 hours ago | parent | next [-] | | I think what he meant was that the top 1% ruling class is keeping those bullshit jobs around to keep the poor people (their cattle) occupied so they won't have time and energy to think and revolt. | | |
| ▲ | Ekaros 2 hours ago | parent [-] | | Or so that everyone in the chain of command has people to rule over, a common want for many in leadership positions. It cuts at least two ways: you want to control people, and your value to your peers is the number of people or resources you control. |
| |
| ▲ | KoftaBob an hour ago | parent | prev [-] | | It seems more like they're implying it's those at the top who think that about other people. |
|
|