| ▲ | KaiserPro a day ago |
| > AI has started to take jobs, but has also created new ones. Yeah nah, there's a key thing missing here: the number of jobs created needs to be greater than the number it's destroyed, and they need to be better paying and arrive in time. History says that when this happens, an entire generation is yeeted onto the streets (see powered looms, the Jacquard machine, steam-powered machine tools). All of that cheap labour needed to power the new towns and cities was created by the automation of agriculture and artisan jobs. Dark satanic mills were fed the descendants of once reasonably prosperous craftspeople. AI as presented here will kneecap the wages of a good proportion of the decent-paying jobs we have now. This will cause huge economic disparities, and probably revolution. There is a reason why the royalty of Europe all disappeared when they did... So no, the stock market will not be growing because of AI, it will be growing in spite of it. Plus China knows that unless it can occupy most of its population with some sort of work, it is finished. AI and decent robot automation are an existential threat to the CCP, as much as to whatever remains of the "west" |
|
| ▲ | kypro a day ago | parent | next [-] |
| > and probably revolution I theorise that revolution would be near-impossible in a post-AGI world. If people consider where power comes from, it's relatively obvious that people will likely suffer and die en masse if we ever create AGI. Historically the general public have held the vast majority of power in society. 100+ years ago this was physical power – the state had to keep you happy or the public would come for it with pitchforks. But in an age of modern weaponry the public today would pose little physical threat to the state. Instead, in today's democracies power comes from the public's collective labour and purchasing power. A government can't risk upsetting people too much because a government's power today is not a product of its standing army, but of its economic strength. A government needs workers to create businesses and produce goods, and therefore the goals of government generally align with the goals of the public. But in a post-AGI world neither businesses nor the state need workers or consumers. In this world, if you wanted something you wouldn't pay anyone for it or hire workers to produce it; instead you would just ask your fleet of AGIs to get you the resource. In this world people become more like pests. They offer no economic value yet demand that AGI owners (whether publicly or privately owned) share resources with them. If people revolted, any AGI owner would be far better off just deploying a bioweapon to humanely kill the protestors rather than sharing resources with them. Of course, this is assuming the AGI doesn't have its own goals and just sees the whole of humanity as a nuisance to be stepped over, in the same way humans will happily step over animals if they interfere with our goals. Imo humanity has 10-20 years left max if we continue on this path. There can be no good outcome of AGI, because it wouldn't even make sense for the AGI, or those who control it, to be aligned with the goals of humanity. |
| |
| ▲ | Centigonal 20 hours ago | parent | next [-] | | I think "resource curse" countries are a great surrogate for studying possible future AGI-induced economic and political phenomena. A country like the UAE (oil) or Botswana (diamonds) essentially has an economic equivalent to AGI: they control a small, extremely productive utility (an oilfield or a mine instead of a server farm), and the wealth generated by that utility is far in excess of what those countries' leaders need to maintain power. Sure, you hire foreign labor and trade for resources instead of having your AGI supply those things, but the end result is the same. | |
| ▲ | robinhoode a day ago | parent | prev | next [-] | | > In this world people become more like pests. They offer no economic value yet demand that AGI owners (whether publicly or privately owned) share resources with them. If people revolted, any AGI owner would be far better off just deploying a bioweapon to humanely kill the protestors rather than sharing resources with them. This is a very doomer take. The threats are real, and I'm certain some people feel this way, but eliminating large swaths of humanity is something dictatorships have tried in the past. Waking up every morning means believing there are others who will cooperate with you. Most of humanity has empathy for others. I would prefer to have hope that we will make it through, rather than drown in fear. | | |
| ▲ | 758597464 a day ago | parent | next [-] | | > This is a very doomer take. The threats are real, and I'm certain some people feel this way, but eliminating large swaths of humanity is something dictatorships have tried in the past. Tried, and succeeded in, in times when people held more power than they do today. Not sure what point you're trying to make here. > Most of humanity has empathy for others. I would prefer to have hope that we will make it through, rather than drown in fear. I agree that most of humanity has empathy for others — but it's been shown that the prevalence of psychopaths increases as you climb the leadership ladder. Fear and hope are the responses of the passive. There are other routes to take. | | |
| ▲ | bamboozled 15 hours ago | parent [-] | | This is basically why open-sourcing everything is increasingly important and imo already making "AI" safer. If the many have access to the latest AI, there is less chance the masses are blindsided by some rogue tech. |
| |
| ▲ | 542354234235 11 hours ago | parent | prev [-] | | >but eliminating large swaths of humanity is something dictatorships have tried in the past. Technology changes things, though. Things aren't "the same as it ever was". The Napoleonic wars killed 6.5 million people with muskets and cannons. The total warfare of WWII killed 70 to 85 million people with tanks, piston-engined bombers, aircraft carriers, and 36 kilotons of TNT worth of atomic bombs, among other weaponry. Total war today includes modern thermonuclear weapons. In 60 seconds, just one Ohio-class submarine can launch 80 independent warheads, totaling over 36 megatons of TNT. That is over 20 times more than all explosives, used by all sides, for all of WWII, including both atomic bombs. AGI is a leap forward in power equivalent to what thermonuclear bombs are to warfare. Humans have been trying to destroy each other for all of time, but we can only have one nuclear war, and it is likely we can only have one AGI revolt. | | |
| ▲ | jplusequalt 9 hours ago | parent [-] | | I don't understand the psychology of doomerism. Are people truly so scared of these futures they are incapable of imagining an alternate path where anything less than total human extinction occurs? Like if you're truly afraid of this, what are you doing here on HN? Go organize and try to do something about this. | | |
| ▲ | 542354234235 7 hours ago | parent [-] | | I don’t see it as doomerism, just realism. Looking at the realities of nuclear war shows that it is a world-ending holocaust that could happen by accident or by the launch of a single nuclear ICBM by North Korea, and there is almost no chance of de-escalation once a missile is in the air. There is nothing to be done, other than advocate for nuclear arms treaties in my own country, but that has no effect on Russia, China, North Korea, Pakistan, India, or Iran. Bertrand Russell said, "You may reasonably expect a man to walk a tightrope safely for ten minutes; it would be unreasonable to do so without accident for two hundred years." We will either walk the tightrope for another 100 years or so until global society progresses to where there is nuclear disarmament, or we won’t. It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone without a powerful Gen AI, or we won’t. I hope for a future of abundance for all, brought to us by technology. But I understand that some existential threats only need to turn the wrong way once, and there will be no second chance ever. | | |
| ▲ | jplusequalt 7 hours ago | parent [-] | | I think it's a fallacy to equate pessimistic outcomes with "realism" >It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone without a powerful Gen AI, or we won’t Okay, you've laid out two paths here. What are *you* doing to influence the course we take? That's my point. Enumerating all the possible ways humanity faces extinction is nothing more than doomerism if you aren't taking any meaningful steps to lessen the likelihood any of them may occur. |
|
|
|
| |
| ▲ | wkat4242 a day ago | parent | prev | next [-] | | > I theorise that revolution would be near-impossible in a post-AGI world. If people consider where power comes from, it's relatively obvious that people will likely suffer and die en masse if we ever create AGI. I agree, but for a different reason. It's very hard to outsmart an entity with an IQ in the thousands and pervasive information gathering. For a revolution you need to coordinate. The Chinese know this very well, which is why they control communication so closely (and why they had Apple restrict AirDrop). But their security agencies are still beholden to people with average IQs and the inefficient communication between them. An entity that can collect all this info on its own, has a huge IQ to spot patterns, and doesn't have to convince other people in its organisation to take action will crush any fledgling rebellion. It will never be able to reach critical mass. We'll just be ants in an anthill, and it will be the boot that crushes us when it feels like it. | |
| ▲ | jplusequalt 9 hours ago | parent | prev | next [-] | | The apathy spewed by doomers actively contributes to the future they whine about. Join a union. Organize with real people. People will always have the power in society. | |
| ▲ | weatherlite 11 hours ago | parent | prev | next [-] | | > In this world people become more like pests. They offer no economic value yet demand that AGI owners (whether publicly or privately owned) share resources with them. If people revolted, any AGI owner would be far better off just deploying a bioweapon to humanely kill the protestors rather than sharing resources with them. That will be quite a hard thing to pull off, even for some evil person with an AGI. Let's say Putin gets AGI and is actually evil and crazy enough to try to wipe people out. If he just targets Russians and starts killing millions of people daily with some engineered virus or something similar, he'll have to fear a strike from the West, which would fear it's next (and rightfully so).
If he instead tries to wipe out all of humanity at once to escape a second strike, he again will have to devise such a good plan that there won't be any second strike, meaning his "AGI" will have to be way better than all other competing AGIs (how exactly?). It would only make sense if all "owners of AGI" somehow conspired together to do this, but there's not really such a thing as owners of AGI, and even if there were, the Chinese, Russian and American owners of AGI don't trust each other at all and are also bound to their governments. | |
| ▲ | dovin 18 hours ago | parent | prev [-] | | Dogs offer humans no economic value, but we haven't genocided them. There are a lot of ways that we could offer value that's not necessarily just in the form of watts and minerals. I'm not so sure that our future superintelligent summoned demons will be motivated purely by increasing their own power, resources, and leverage. Then again, maybe they will. Thus far, AI systems that we have created seem surprisingly goal-less. I'm more worried about how humans are going to use them than some sort of breakaway event but yeah, don't love that it's a real possible future. | | |
| ▲ | chipsrafferty 18 hours ago | parent [-] | | A world in which most humans fill the role of "pets" of the ultra rich doesn't sound that great. | | |
| ▲ | dovin 18 hours ago | parent [-] | | Humans becoming domesticated by benevolent superintelligences are some of the better futures with superintelligences, in my mind. Iain M Banks' Culture series is the best depiction of this I've come across; they're kind of the utopian rendition of the phrase "all watched over by machines of loving grace". Though it's a little hard to see how we get from here to there. | | |
| ▲ | autumnstwilight 17 hours ago | parent [-] | | Honestly that part of the article and some other comments have given me the idle speculation, what if that was the solution to the, "Humans no longer feel they can meaningfully contribute to the world," issue? Like we can satisfy the hunting and retrieval instincts of dogs by throwing a stick, surely an AI that is 10,000 times more intelligent can devise a stick-retrieval-task for humans in a way that feels like satisfying achievement and meaningful work from our perspective. (Leaving aside the question of whether any of that is a likely or desirable outcome.) | | |
| ▲ | bamboozled 15 hours ago | parent [-] | | What will AI find fulfilling itself? I find that to be quite a deep question. I feel the limitations of humans are quite a feature when you think about what the experience of life would be like if you couldn’t forget, or never experienced things for the first time. If you already knew everything and could achieve almost anything with zero effort, it actually sounds…insufferable. | | |
| ▲ | te0006 13 hours ago | parent [-] | | You might find Stanislaw Lem's Golem XIV worth a read, in which what we now call an AGI shares, amongst other things, its knowledge and speculations about the long-term evolution of superintelligences, in a lecture to humans, before entering the next stage itself.
https://www.goodreads.com/book/show/10208493
It seems difficult to obtain an English edition these days but there is a reddit thread you might want to look into. |
|
|
|
|
|
|
|
| ▲ | OgsyedIE a day ago | parent | prev | next [-] |
| Unfortunately the current system is doing a bad job of finding replacements for dwindling crucial resources such as petroleum basins, new generations of workers, unoccupied orbital trajectories, fertile topsoil and copper ore deposits. Either the current system gets replaced with a new system or it doesn't. |
|
| ▲ | a day ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | pydry a day ago | parent | prev | next [-] |
| >History says that actually when this happens, an entire generation is yeeted on to the streets History hasn't had to contend with a birth rate of 0.7-1.6. It's kind of interesting that the elite capitalist media (Economist, Bloomberg, Forbes, etc.) is projecting a future crisis of both not enough workers and not enough jobs simultaneously. |
| |
| ▲ | wkat4242 a day ago | parent | next [-] | | I don't really get the American preoccupation with birth rates. We're already way overpopulated for our planet, and this shows in environmental issues, housing costs, overcrowded cities, etc. It's a great thing if our population starts plateauing, or even shrinks a bit. And no, we're not going extinct. It'll just cause some temporary issues, like an ageing population that has to be cared for, but those issues are much more readily fixable than environmental destruction. | | |
| ▲ | NitpickLawyer 17 hours ago | parent | next [-] | | > I don't really get the American preoccupation with birth rates. Japan is currently in the finding out phase of this problem. | |
| ▲ | yoyohello13 a day ago | parent | prev | next [-] | | I think it’s more of a “be fruitful and multiply” thing than an actual existential-threat thing. You can see many of the loudest people talking about it either have religious undertones or want more peasants to work the factories. Demographic shift will certainly upset the status quo, but we will figure out how to deal with it. | |
| ▲ | ahtihn 17 hours ago | parent | prev | next [-] | | The planet is absolutely not over populated. Overcrowded cities and housing costs aren't an overpopulation problem but a problem of concentrating economic activity in certain places. | | | |
| ▲ | torlok a day ago | parent | prev | next [-] | | Don't try to reason with this population collapse nonsense. This has always been about racists fearing that "not enough" white westerners are being born, or about industrialists wanting infinite growth. For some prominent technocrats it's both. | | |
| ▲ | gmoot 21 hours ago | parent [-] | | The welfare state is predicated on a pyramid-shaped population. Also: people deride infinite growth, but growth is what is responsible for lifting large portions of the population out of poverty. If global markets were repriced tomorrow to expect no future growth, economies would collapse. There may be a way to accept low or no growth without economic collapse, but if there is no one has figured it out yet. That's nothing to be cavalier about. | | |
| ▲ | pydry 21 hours ago | parent [-] | | The welfare state isn't predicated on a pyramid shape, but the continued growth of the stock market and endless GDP growth certainly are. >infinite growth, but growth is what is responsible for lifting large portions of the population out of poverty It's overstated. The preconditions for GDP growth - namely a lack of war and corruption - are probably more responsible than the growth itself. |
|
| |
| ▲ | ttw44 7 hours ago | parent | prev | next [-] | | We are not overpopulated. I hate the type of people that hammer the idea that society needs to double or triple the birthrate (Elon Musk), but as it currently stands, countries like South Korea, Japan, USA, China, and Germany risk extinction or economic collapse in 4-5 generations if the birth rate doesn't rise or the way we guarantee welfare doesn't change. | |
| ▲ | luxardo 10 hours ago | parent | prev | next [-] | | We are most certainly not "overpopulated" in any way. Per-person consumption is the issue. And no society, ever, has had a good standard of living with a shrinking population. You are advocating for all young people to toil their entire lives taking care of an ever-aging population. | |
| ▲ | alxjrvs a day ago | parent | prev | next [-] | | Racist fears of "replacement", mostly. | |
| ▲ | chipsrafferty 18 hours ago | parent | prev | next [-] | | It's the only way to increase profits under capitalism in the long term once you've optimized the technology. | |
| ▲ | mattnewton a day ago | parent | prev [-] | | I think a good part of it is fear of a black planet. |
| |
| ▲ | KaiserPro 12 hours ago | parent | prev [-] | | > History hasnt had to contend with a birth rate of 0.7-1.6. I think thats just not true: https://en.wikipedia.org/wiki/Peasants%27_Revolt A large number of revolutions/rebellions are caused by mass unemployment or famine. |
|
|
| ▲ | torlok a day ago | parent | prev | next [-] |
| Hayek's ideas have been pushed by US corporations so hard for so long that regular people treat the invisible hand of the market like it's gospel. |
|
| ▲ | baq 16 hours ago | parent | prev [-] |
| > So no, the stock market will not be growing because of AI, it will be in spite of it. The stock market will be one of the very few ways you will be able to own some of that AI… assuming it won’t be nationalized. |