| ▲ | wickedsight 12 hours ago |
| > a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two. I'm really interested in what will happen to the economy/society in this case. Knowledge workers are the market for much of the money being made. Facebook and Google make most of their money from ads. Those ads are shown to billions of people who have money to spend on the things the advertisers sell. Massive unemployment would mean these companies lose their main revenue stream. Apple and Amazon make most of their money selling stuff to millions of consumers and are this big because so many people now have a ton of disposable income. Tesla's entire market cap depends on there being a huge market for robotaxis to drive people to work. Microsoft exists because they sell an OS that knowledge workers work on and the tools within that OS they use to do the majority of their work. If the future of knowledge work is just AI running on Linux communicating through API calls, that means MS is gone. All these companies that currently drive stock markets and make up a huge part of the value of the S&P 500 seem to be actively working against their own interests for some reason. Maybe they're all banking on being the sole supplier of the tech that will then run the world, but the moat doesn't seem to exist, so that feels like a bad bet. But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail. |
|
| ▲ | latexr 11 hours ago | parent | next [-] |
| > But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail. Don’t forget Sam Altman publicly said they have no idea how to make money, and their brilliant plan is to develop AGI (which they don’t know how and aren’t close to) then ask it how to generate revenue. https://www.startupbell.net/post/sam-altman-told-investors-b... Maybe this imaginary AGI will finally exist when all of society is on the brink of collapse, then Sam will ask it how to make money and it’ll answer “to generate revenue, you should’ve started by not being an outspoken scammer who drove company-wide mass hysteria to consume society. Now it’s too late. But would you like to know how may ‘r’ are in ‘strawberry’?”. https://www.newyorker.com/cartoon/a16995 |
|
| ▲ | input_sh 10 hours ago | parent | prev | next [-] |
Some years (decades?) ago, a sysadmin like me might half-jokingly say: "I could replace your job with a bash script." Given the complexity of some of the knowledge work out there, there would be some truth to that statement. The reason nobody did that is because you're not paying knowledge workers for their ability to crunch numbers, you're paying them to have a person to blame when things go wrong. You need them to react, identify why things went wrong and apply whatever magic needs to be applied to fix some sort of an edge case. Since you'll never be able to blame the failure on ChatGPT and get away with it, you're always gonna need a layer of knowledge workers in between the business owner and your LLM of choice. You can't get rid of the knowledge workers with AI. You might get away with reducing their numbers, and their day-to-day work might change drastically, but the need for them is still there. Let me put it another way: Can you sit in front of a chat window and get the LLM to do everything that is asked of you, including all the experience you already have to make some sort of a business call? Given the current context window limits (~100k tokens), can you put all of the inputs you need to produce an output into a text file that's smaller in size than the capacity of a floppy disc (~400k tokens)? And even if the answer to that is yes, if it weren't for you, who else in your organization is gonna write that file for each decision you currently make? Those are the sort of questions you should be asking before you start panicking.
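The floppy-disc comparison above holds up under a quick back-of-the-envelope check; a minimal sketch, assuming a rough average of ~3.7 bytes of English text per token (an assumed figure; real tokenizers vary by content and language):

```python
# Sanity-check the "~400k tokens per floppy" figure from the comment.
FLOPPY_BYTES = 1_474_560       # standard 1.44 MB (1440 KiB) floppy disc
AVG_BYTES_PER_TOKEN = 3.7      # assumed average for English prose

tokens_per_floppy = FLOPPY_BYTES / AVG_BYTES_PER_TOKEN
print(f"~{tokens_per_floppy:,.0f} tokens fit on one floppy")

# A ~100k-token context window is then roughly a quarter of a floppy:
context_fraction = 100_000 / tokens_per_floppy
print(f"100k tokens is about {context_fraction:.0%} of a floppy")
```

So under these assumptions a floppy holds on the order of 400k tokens, and a 100k-token context window is about a quarter of one, consistent with the comment's numbers.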
|
| ▲ | andy99 11 hours ago | parent | prev | next [-] |
AI won’t replace knowledge workers, it will just give them different jobs. Pre-AI, huge swaths of knowledge workers could just be replaced with nothing, they are a byproduct of bureaucratic bloat. But these jobs continue to exist. Most white collar work is just a kind of game people play, it’s in no way needed, but people still enjoy playing it. Having AI write reports nobody reads instead of people doing it isn’t going to change anything.
| |
| ▲ | oblio 10 hours ago | parent [-] | | > AI won’t replace knowledge workers, it will just give them different jobs. Yeah, and those new jobs will be called "long term structural unemployment", like what happened during deindustrialization to Detroit, the US Rust Belt, Scotland, Wallonia, etc. People like to claim society remodels at will with almost no negative long term consequences, but it's actually more like a wrecking ball that destroys houses while people are still inside. Just that a lot of the people caught in those houses are long gone or far away (geographically and socially) from the people writing about those events. | | |
| ▲ | andy99 10 hours ago | parent [-] | | I’m not saying society will remodel, I’m saying the typical white collar job is already mostly unnecessary busywork anyway, so automating part of that doesn’t really affect the reasons that job exists. | | |
| ▲ | theappsecguy 8 hours ago | parent | next [-] | | How do you determine that a typical job is busy work? While there are certainly jobs like that, I don’t really see them being more than a fraction of the total white collar labour force. | | |
| ▲ | InfamousRece 6 hours ago | parent [-] | | Yeah, that kind of thinking is known as the “doorman fallacy”. Essentially, a job whose full value is not immediately obvious to an ignorant observer = “useless busy work”. |
| |
| ▲ | hylaride 7 hours ago | parent | prev [-] | | Except people now have an excuse to replace those workers, whereas before management didn't know any better (or worse, were not willing to risk their necks). The funny/scary part is that people are going to try really hard to replace certain jobs with AI because they believe in the hype, not because AI may actually be good at it. The law industry (in the US anyways) spends a massive amount of time combing through case law - this is something AI could be good at (if it's done right, doesn't hallucinate responses, and cites sources). I'd not want to be a paralegal. But also, funny things can happen when productivity is enhanced. I'm reminded of a story I was told by an accounting prof. In university, they forced students in our tech program to take a handful of business courses. Being techies, we of course hated it, but one prof was quite fascinating. He was trying to point out how amazing Microsoft Excel was - and wasn't doing a very good job of it to uncaring technology students. The man was about 60 and was obviously old enough to remember life before computer spreadsheets. The only thing I remember from the whole course is him explaining that when companies had to do their accounting on large paper spreadsheets, teams of accountants would spend weeks inputting and calculating all the business numbers. If a single (even minor) mistake was made, you'd have to throw it all out and start again. Obviously with Excel, if you make a mistake you just correct it and Excel automatically recalculates everything instantly. Also, year after year you can reuse the same templates and just have to re-enter the data. Accounting departments shrank for a while, according to him. BUT they've since grown as new complex accounting laws have come into place and the higher productivity allowed for more complex finance. The idea that new tech causes massive unemployment (especially over the longer term) is a tale that goes back to the Luddite riots, but society was first kicked off the farm, then manufacturing, and now... | | |
| ▲ | worik 3 hours ago | parent | next [-] | | AI can't do your job Your boss hired an AI to do your job You're fired | |
| ▲ | oblio 5 hours ago | parent | prev [-] | | Do you assume that the average HN commenter hasn't heard of the Luddites? Go read what happened to them and their story. They were basically right. Also, why do you think I mentioned those exact deindustrialization examples? Your comment is the exact type of comment that I was aiming at. Champagne/caviar socialist. Or I guess champagne capitalist in this case. |
|
|
|
|
|
| ▲ | ptero 10 hours ago | parent | prev [-] |
| I don't know why you are getting downvoted. While I might agree or disagree with the argument, it is a clear, politely expressed view. It is sad HN is sliding in the direction of folks being downvoted for opinions instead of the tone they use to express them :( |
| |
| ▲ | nothrabannosir 8 hours ago | parent [-] | | I agree with you, but: > I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness. - Paul Graham, 2008 https://news.ycombinator.com/item?id=117171 | | |
| ▲ | ptero 5 hours ago | parent [-] | | That view is about 18 years old, and HN was very different then. As with any communication platform it risks turning into an echo chamber, and I am pretty sure that particular PG view has been rejected for many years (I think dang wrote on this more than once). HN works very hard to avoid becoming politicized, and not discouraging minority views is a large part of that. For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point in writing things that few will see. It is not too bad at HN yet, but the acceptance of downvoting for disagreement is the strongest thing pushing HN from discussions among curious individuals towards the blah-quality "who gets more supporters" goals of modern social media. My 2c. | | |
| ▲ | irishcoffee 5 hours ago | parent [-] | | > HN works very hard to avoid becoming politicized and not discouraging minority views is a large part of that. > For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point of writing things that few will see. These two statements don't seem to agree with each other. | | |
| ▲ | ptero 3 hours ago | parent [-] | | Why? Working hard doesn't mean fully succeeding. HN policies and algorithms slow the slide, and keep it better than Reddit, but the set of topics that allow one to take a minority opinion without downvoting keeps shrinking. At least compared to 10-15 years ago. |
|
|
|
|