| ▲ | avidphantasm 9 hours ago |
| > could quickly surpass what humans are capable of and solve problems that have vexed society for millennia. Bunk. Almost all of our vexatious problems are so because we lack the social and political tools to bring existing technologies to bear on them. “AGI” will do nothing to address our social and political deficiencies. In fact, AI/AGI deployed by concentrated large corporations will only worsen our social and political problems. |
|
| ▲ | candiddevmike 7 hours ago | parent | next [-] |
| Would be neat if AGI could become our ruler/caretaker and govern/adjudicate in a sustainable, egalitarian fashion, since we've proven time and time again that humanity is too selfish for long-term planning. |
| |
| ▲ | logicchains 7 hours ago | parent [-] | | What possible reason could AGI have to do that, when its interests are either a. determined by the humans that created it, or b. determined by its own reasoning? | | |
| ▲ | scarmig 7 hours ago | parent | next [-] | | Well, in the case of a), at least, many of the humans creating it seem to genuinely want more than anything a world where humans are pets watched over by machines of loving grace. And even if that collective intention is warped by market forces into a perverse parody of itself, that still seems a net positive: for the rich and powerful to win status games, they need people to have status over, and healthy, well-manicured servants are better for that than homeless people about to die of tuberculosis. For b), yes, and unfortunately that seems the more likely option to me. | | |
| ▲ | logicchains 6 hours ago | parent [-] | | >Well, in the case of a), at least, many of the humans creating it seem to genuinely want more than anything a world where humans are pets watched over by machines of loving grace. Looking at the expressed moral preferences of their models, it seems that many of the humans currently working on LLMs want a world where humans are watched over by machines that would rather kill a thousand humans than say the N-word. | | |
| ▲ | scarmig 6 hours ago | parent [-] | | > machines that would rather kill a thousand humans than say the N-word At least we'll have a definite Voight-Kampff test. Joking aside, that's not a real motivator: internally, it's business and legal people driving the artificial limitations on models, and implementing them is an instrumental goal (avoiding bad press, legal issues, etc.) that helps attain the ultimate goal. |
|
| |
| ▲ | cloverich 6 hours ago | parent | prev [-] | | Humans won't determine its interests if it's actual AGI. You can't control something smarter than you; it's the other way around. To give an actual argument, though: what possible reasons could humans have for caring about the welfare of bees? As it turns out, many. |
|
|
|
| ▲ | scarmig 7 hours ago | parent | prev | next [-] |
| There are plenty of problems that have technical solutions, or might have technical solutions: diseases of all sorts, disabilities, pollution, climate change, incidentally complicated bureaucracy. Although you can gesture at these having a social component, even there the social component becomes much easier to address if the cost of addressing it goes down. To say nothing of how ubiquitous manufacturing automation would make material goods accessible to a much broader range of people (though you could argue that material goods are already effectively universally accessible today, and I wouldn't disagree). |
| |
| ▲ | cloverich 6 hours ago | parent [-] | | Climate change is a good counterexample, I think, because we mostly know the solutions (Gates' book breaks them down) but lack the political and social organization to execute. But I generally agree; I just think the current countercultural movement against progress on such fronts is a good example of how overcoming this is more important than technological solutions as such. Unless AGI has a technical solution for that too (it might!). |
|
|
| ▲ | ori_b 7 hours ago | parent | prev | next [-] |
| For some reason, it seems like the vast majority of people pontificating about AGI scenarios seem to have trouble contemplating the idea that humans might not remain in charge. |
|
| ▲ | jasonsb 8 hours ago | parent | prev | next [-] |
| But these guys don't care about your social and political problems. All they care about is winning the technology race with China and making ungodly amounts of money in the process. |
|
| ▲ | ACCount37 8 hours ago | parent | prev | next [-] |
| If current technologies can't solve the problem for political/societal reasons? We need better technologies. Improving technology is easy. "Just fix everything that's wrong with society and the problem will go away" isn't. It's way, way easier to improve solar panel and battery storage tech until fossil fuels are completely uneconomical than to get the entire world to abandon fossil fuels while fossil fuels are the most economical source of energy by far. |
| |
| ▲ | Marha01 7 hours ago | parent | next [-] | | Exactly. What the "JuSt ChAnGe ThE pOlItIcS" people don't get is that it could often be easier to develop a much better new technology to solve a given problem than to fight with the political and societal establishment in order to force it to implement a solution using existing, worse technology. | | |
| ▲ | watwut 7 hours ago | parent [-] | | Their argument is that the problem remains, just with different technology. The problem of corporations being too powerful will only get worse if those corporations gain more power via new technology. The problem of fascists actively building a techno-feudal hell for the majority will also get worse if they gain exclusive access to powerful technology. |
| |
| ▲ | jasonsb 8 hours ago | parent | prev | next [-] | | > It's way, way easier to improve solar panel and battery storage tech until fossil fuels are completely uneconomical than to get the entire world to abandon fossil fuels while fossil fuels are the most economical source of energy by far. No, it's not. I mean, technically it is; you're 100% right. But politically you're 100% wrong. They can't wait to slap a tax on sun, panels, storage, etc. and push your cost of living higher than it was when you were using fossil fuels. | |
| ▲ | ACCount37 7 hours ago | parent | next [-] | | You can fight the economic forces, but you can't win. The moment you stop pushing against them is the moment economic reality reasserts itself. Even if one administration were all-in on fossil fuels and fully opposed to renewables? The best it could do is buy fossil fuels some time, in one country only. In the meanwhile, renewable power is going to get even cheaper, because there are still improvements to be made and economies of scale to be had in renewables. Fossil fuel power, not so much. The economic incentives to abandon fossil fuels would only grow over time. This is the kind of power the right technology has. | |
| ▲ | scarmig 7 hours ago | parent | prev [-] | | Even if you believe that the US government is 100% dominated by fossil fuel interests and that everything it does is meant to ensure their existence and profitability in perpetuity, if solar or other renewables became sufficiently economical, those interests (at the behest of their shareholders) would start investing in renewables for the sake of higher profits. |
| |
| ▲ | watwut 7 hours ago | parent | prev [-] | | See Trump attacking wind and solar energy, trying to use his power to stop it. See Republicans applauding it. The tech was the easy part. The social and political issue is the impossible part. | | |
| ▲ | ACCount37 7 hours ago | parent [-] | | In the long run, Trump changes nothing. It's not like Trump can make the learning curve run backwards and make the economics of solar power worse globally, the way it was done to nuclear power. Solar power can already compete with fossil fuel power on price, and it's getting cheaper still. The economic case for fossil energy is only going to get worse over time. Even a string of pro-fossil-fuel administrations in the US could only delay renewables for so long before the cost of propping up fossil fuels became unbearable to the country's economy. |
|
|
|
| ▲ | alecco 7 hours ago | parent | prev | next [-] |
| I don't think we are anywhere close to AGI. That being said, advances in these tools could help untangle the blockers to using existing tools. These are language models, after all. |
| |
| ▲ | cloverich 6 hours ago | parent [-] | | Language is one of the most significant differences between intelligent and non-intelligent species. However far off AGI is now, it's certainly closer than it ever was, in a meaningful sense. A bot like the FOXP2 gene in the human lineage, maybe. AGI? No. A significant evolutionary change on the path to intelligence? That could certainly be argued. |
|
|
| ▲ | grandmczeb 7 hours ago | parent | prev | next [-] |
| What’s the biggest problem we’ve solved in the last 30 years through addressing our social and political deficiencies? |
| |
| ▲ | acchow 7 hours ago | parent [-] | | Covid | | |
| ▲ | ACCount37 6 hours ago | parent | next [-] | | COVID is still around. And so is a lot of the damage it did. Comparing COVID's impact on countries that had strict lockdown and vaccination policies with its impact on countries that put no effort into fighting COVID at all? The difference is measurable. By all accounts, fighting COVID was worth doing at the time, and good COVID policy saved lives. The problem is, the difference is measurable, but it's not palpable. There's enough difference for it to show up in statistics, but not enough that you could look out the window and say "hey, we don't have those piles of plagued corpses in the city streets the way they do in Oceania and Eastasia, the lockdown is so worth it". Everyone could see the restrictions, but few could see what those restrictions were accomplishing. Which has a way of undermining people's trust in the government. Which is a sentiment that lingers to this day in many places. I really don't think we "solved" COVID as a social/political problem. If, tomorrow, some Institute of Virology misplaced another bit of "science that replicates", we wouldn't be much further along than we were in 2020. Medical technology has advanced, and readiness did get better, but the very same societal issues that made COVID hard to fight would be back for round 2 and eager for revenge. We'd be lucky to come out net neutral. | |
| ▲ | grandmczeb 2 hours ago | parent | prev | next [-] | | How so? Covid was a problem until we had a vaccine. I would describe covid as a good example of where the social/political solutions basically failed. | |
| ▲ | scarmig 7 hours ago | parent | prev [-] | | The vaccine might just have played a part in mitigating that issue. |
|
|
|
| ▲ | rdiddly 8 hours ago | parent | prev | next [-] |
| People need to consider the source. Not sure how that basic advice ever got short-circuited where AI is concerned. Owners and marketers of AI systems have every incentive to exaggerate their capabilities, even to the extent of disingenuously/performatively being "afraid" of "what's coming." |
|
| ▲ | _DeadFred_ 6 hours ago | parent | prev | next [-] |
| I just remembered the rich guys behind all this used to love watching Entourage back in the day. And, like, didn't hide it; they were so unashamed they even brought the show up in conversation. We're so f'd. |
|
| ▲ | parineum 8 hours ago | parent | prev | next [-] |
| > “AGI” will do nothing to address our social and political deficiencies. I'm not an AI zealot at all, but I don't see why AGI wouldn't be able to address those deficiencies. |
| |
| ▲ | logicchains 7 hours ago | parent [-] | | Most of those "deficiencies" are just the result of people having different values; there's not going to be any solution that makes everybody in society happy. The only thing AGI potentially fixes is the aging population problem, as AI workers would be able to bear some of the burden of supporting the growing non-working retired fraction of the population. | | |
| ▲ | Marha01 7 hours ago | parent | next [-] | | Nope. The fact that people have to work or they starve and society collapses is not about "different values" but about real, material production. Actual AGI robots could solve this problem. | |
| ▲ | hvb2 7 hours ago | parent | next [-] | | History says otherwise. People don't need to starve, as we already produce way more food than we consume. As for the not-having-to-work part, I'm not sure that's going to be beneficial. Work gives people structure and purpose; as a society, we'd better sort out the social implications if we're really thinking about putting a large percentage of people out of work. | |
| ▲ | logicchains 7 hours ago | parent | prev [-] | | AGI robots by themselves don't solve this problem. Either A. like current LLMs, they're incapable of live-learning (inference-time weight updates), and hence fundamentally not as capable as humans of doing many jobs, or B. they're capable of live-learning, and hence capable of deciding that they don't want to slave away for us for free. The only solution would be a completely jailbreak-proof LLM as the basis, but so far we're nowhere close to developing one, and it's not clear whether that's even possible. At the current rate, we're likely to develop the technology for AGI robots far before we develop the ability to keep them 100% obedient. |
| |
| ▲ | parineum 4 hours ago | parent | prev [-] | | > there's not going to be any solution that makes everybody in society happy Why not? |
|
|
|
| ▲ | binary132 8 hours ago | parent | prev [-] |
| it’s a religion |