| ▲ | gnarlouse 7 hours ago |
| My (paranoid) unpopular take: the AI boom we’re currently experiencing is a concerted effort by the billionaires to maintain operational agency (the ability to think and do at a massive scale) once society begins to collapse due to climate change. ~~ edit ~~ Thank you for the sane responses. I’m reconsidering how much I believe this. |
|
| ▲ | fmbb 7 hours ago | parent | next [-] |
| How would that work?
AI cannot run if society collapses. Maintaining all that infrastructure and supplying spare parts is not going to work. Also, AI cannot do anything on its own; it can barely do anything even with support from humans. |
| |
| ▲ | mariusor 6 hours ago | parent | next [-] | | This is also my reasoning for why I think AI alignment is not going to be a problem for humanity any time soon. By the time AI is capable of maintaining the whole supply chain required to keep itself running, sufficient time will have passed for us to come up with something viable. | | | |
| ▲ | Teknomadix 6 hours ago | parent | prev | next [-] | | Long before 2100, critical AI systems will no longer be operating from this soil. They will be in Earth's orbit, and on its moon. | | |
| ▲ | vardump 6 hours ago | parent [-] | | And the industrial base that maintains it? Chips have a limited lifespan. |
| |
| ▲ | ceejayoz 6 hours ago | parent | prev | next [-] | | > AI cannot run if society collapses. That doesn’t mean some idiot billionaires huffing each others’ farts can’t think it can. | |
| ▲ | ademup 6 hours ago | parent | prev [-] | | Respectfully disagree. An AI with full access to robots could do everything on its own that it would need to "survive" and grow. I argue that humans are actually in the way of that. | | |
| ▲ | mariusor 6 hours ago | parent | next [-] | | "robots" is a very hand wavy answer. There's so much that goes into the supply chain of improving and running AI that I, a human, feel quite safe. | | |
| ▲ | malwrar 5 hours ago | parent [-] | | Is there any particular element of the supply chain that you feel makes “robots” hand-wavy? | | |
| ▲ | nosianu 2 hours ago | parent | next [-] | | In support of the other reply, here is a look at the supply chain of a very simple product - a can of Coke. https://medium.com/@kevin_ashton/what-coke-contains-221d4499... (https://archive.md/PPYez)

The highlighted parts are a kind of TL;DR, but in this context actually reading the piece (it is not long) is required to get anything out of it for the arguments used here. Anything technological is orders of magnitude more complex. Pointing to any single part makes no sense; the point is the complexity and interconnectedness of everything. An AI doing everything on its own would be harder than the East Bloc countries' attempt to centrally plan their entire economies, and those economies were much simpler than what such a mighty AI would require for itself and its robot minions. And that's just the organization.

I did like "Gaia" in Horizon Zero Dawn (the game) because it made a great story, though. That is pretty much exactly the kind of AI fantasized about here. Douglas Adams hints at hidden complexity towards the end of HHGTTG, in the collapse of Golgafrincham's society: you overlook just one tiny thing and it escalates to failure from there.

Biological systems don't have that problem; they are self-assembling no matter how you slice and dice them. You may just end up with a very different ecosystem, but as long as the environment is not completely outside the useful range, it will grow and reorganize. Human-made engineered things, on the other hand, will just fail, and that's it; they will not rise on their own from nothing. Human-made systems are much, much more fragile than biological ones (even if you can't guarantee what kind of biological system you will get after rounds of growth and adaptation). | |
| ▲ | mariusor 4 hours ago | parent | prev [-] | | The length and breadth of it mostly. |
|
| |
| ▲ | kubb 6 hours ago | parent | prev | next [-] | | I think this is a very common opinion here. I'd say at least 15% of people believe that. | |
| ▲ | mtlmtlmtlmtl 6 hours ago | parent | prev [-] | | Yeah? How many robots? What kind of robots? What would the AI need to survive? Are the robots able to produce more robots? How are the robots powered? Where will they get energy from? Sure it's easy to just throw that out there in one sentence, but once you actually dig into it, it turns out to be a lot more complicated than you thought at first. It's not just a matter of "AI" + "Robots" = "self-sustaining". The details matter. |
|
|
|
| ▲ | andybak 7 hours ago | parent | prev | next [-] |
| This makes no sense. It takes a complex industrial society to keep that tech going. The supply chain to make GPUs would not survive even a modest disruption in the world economy. It's probably the most fragile thing we currently manufacture. |
| |
| ▲ | ben_w 7 hours ago | parent | next [-] | | If you're an AI company and you believe your own hype (like Musk seems to), you'll probably believe that you can automate everything from digging minerals out of the ground all of the way up to making the semiconductors in the robots that dig the minerals. As you may infer from my use of the word "hype", I do not think we are close to such generality at a high enough quality level to actually do this. | | |
| ▲ | SoftTalker 7 hours ago | parent | next [-] | | This presumes that the surviving humans will not actively disrupt or destroy these automated industries. Disruption seems highly likely, as survivors will want to scavenge them for anything of value or repurpose them for their own ends. | | |
| ▲ | ben_w 6 hours ago | parent [-] | | There's lots of implicit assumptions or this would be a book, but remember that Musk has a rocket and wants to colonise Mars, and that Mars is so bad that it is currently 100% populated by robots. For the billionaires without rockets, there's also a whole bunch of deserts conveniently filled with lots of silicon. (Or as Mac(Format|World|User) put it sometime in the 90s when they were considering who might bail out Apple and suggested one of the Middle East oil barons, a "silly con"). | | |
| ▲ | SoftTalker 5 hours ago | parent [-] | | Musk smokes a lot of weed. We won't have a colony on Mars in his grandchildren's lifetime. | |
| ▲ | ben_w 5 hours ago | parent [-] | | His lifetime, I agree unlikely, but also I think that will be short: he's pissed off too many other powerful people and will get the Western equivalent of Russian oligarchs "falling out of a window". The economics he talks about are all nonsense. No bank will lend someone $200k for the ticket to go to Mars on the off chance they might be a successful pizza restaurateur. But like I said, if you're (e.g.) him and you buy your own hype… (His grandkids' lifetimes are another question entirely. Things are changing too fast.)
|
|
| |
| ▲ | gnarlouse 7 hours ago | parent | prev [-] | | While I believe we’re in a slow takeoff, I believe we are in a takeoff. The important question to my mind is whether AGI comes before systemic societal collapse due to climate change. I think it does, and my tin foil hat grows a wider brim with each passing day. I hope I’m wrong! |
| |
| ▲ | throwaway0123_5 6 hours ago | parent | prev | next [-] | | This is also why I'm skeptical of claims that it would be impossible (or nearly so) for governments to meaningfully regulate AI R&D/deployment (regardless of whether or not they should). The "you can't regulate math" arguments. Yeah, you can't regulate math, but using the math depends on some of the most complex technologies humanity has produced, with key components handled by only one or a few companies in only a handful of countries (US, China, Taiwan, South Korea, Netherlands, maybe Japan?). US-China cooperation could probably achieve any level of regulation they want up to and including "shut it all down now." Likely? Of course not. But also not impossible if the US and China both felt sufficiently threatened by AI. The only thing that IMO would be really hard to regulate would be the distribution of open-weight models existing at the time regulations come into effect, although I imagine even that would be substantially curtailed by severe enough penalties for doing so. | |
| ▲ | gnarlouse 7 hours ago | parent | prev | next [-] | | This is the best argument I've heard against it, so thanks. My anxiety entirely orbits around the scale of AI compute we've reached (and the sentiment that there is still drastic room for improvement), the rapidly advancing state of the art in robotics, and the massive potential for disruption of the middle and lower classes' stake in society. Not to mention the general sentiment that the economy is more important than people's well-being in 99.9% of scenarios. | |
| ▲ | a2128 6 hours ago | parent | prev [-] | | Who's to say it has to keep moving forward? The companies are buying up massive amounts of GPUs in this AI race, a move that's widely questioned because next year's GPUs might render the current ones outdated[0], so there will probably be plenty of GPUs to go around if the CEO demands it (prior to collapse). Operating datacenters would probably be out of the question with a collapsed society as the power grid might be unreliable, global networks might be down and securing many datacenters would probably be difficult, but there's at least one public record of a billionaire building his own underground bunker with off-grid power generation and enough room to have his own little datacenter inside[1]. "Ordinary" people will acquire 32GB GPUs or Mac Studios for local open-source LLM inference, so it seems likely billionaires would just do the next step up for their bunker and use their company's proprietary weights on decommissioned compute clusters. [0] https://www.cnbc.com/2025/11/14/ai-gpu-depreciation-coreweav...
[1] https://www.businessinsider.com/mark-zuckerberg-hawaii-under... |
|
|
| ▲ | forinti 7 hours ago | parent | prev | next [-] |
If there's an evil plot, its goal must surely be to accelerate environmental degradation. First we had the blockchain, now AI, to consume enormous amounts of resources and distract us from what we should be investing in to make the environment healthier. |
|
| ▲ | barbazoo 7 hours ago | parent | prev | next [-] |
| Concerted effort among the greediest people in the world all competing with each other? I find that very hard to imagine. |
|
| ▲ | dkdcio 7 hours ago | parent | prev [-] |
| do you think it’s one person or a group of them that meets? design by committee? how are they getting it all done? let’s hear it! |
| |
| ▲ | bryanrasmussen 5 hours ago | parent | next [-] | | I guess it's whoever was in that Doug Rushkoff meeting with the whole idea we'll have security forces with those exploding dog collars to keep them in line and to keep revolutionary forces from killing us and taking our food supply! https://english.elpais.com/technology/2023-09-20/writer-doug... | |
| ▲ | exe34 7 hours ago | parent | prev | next [-] | | it's very easy to achieve great things without coordination if you can just do what's best for yourself and help your peers achieve their collective goals. but they do meet at Davos every now and again, without the democratic shackles. | |
| ▲ | gnarlouse 7 hours ago | parent | prev [-] | | I don’t know if I believe it’s an active conspiracy. Instead I think it’s more of a very concerning, very plausible eventuality. | | |
| ▲ | dkdcio 6 hours ago | parent [-] | | FWIW I do agree with the operational agency at scale bit and I’m always fascinated by these conspiracy theories, was genuinely hoping to get one (but also happy to see you’re challenging your own position). the idea of people coordinating on these things is very funny to me I think like all tech people will use it for good and bad. those in power have more power etc etc I think it tends to boil down to whether you believe people are, overall, good or bad. over time, that’s what you’ll get with use of tech | | |
| ▲ | gnarlouse 6 hours ago | parent [-] | | You should go see "Bugonia" by Yorgos Lanthimos then, if you haven't yet! That movie might be right up your alley.
|
|
|