rgbrgb a day ago:

To be fair, that's exactly what's at issue. Controlling AI implies controlling society as intelligence scales.
Nasrudith a day ago (parent):
This is singularitarian fallacies all over again, like "being able to make something smarter than a human means infinitely smart, because it can just keep making something smarter," while ignoring the multifaceted nature of intelligence and the time and other costs involved in creation. It all gets handwaved away as superintelligence somehow enabling goddamned sorcery that ignores physical constraints. Except reality does not work that way.

It reminds me of the "Einstein's superintelligent cat" refutation of such fallacies. It went something like this: imagine Einstein has a superintelligent cat. The room has only one door, and it is locked. The cat is not capable of opening the lock due to its lack of manual dexterity. The cat does not want to go into the carrier. Einstein, however, is an order of magnitude greater in mass. As much as the cat might want to escape Albert Einstein's grip, it cannot. The superintelligent cat is going in the carrier.

The point being that, no, controlling or creating AI does not in fact equate to controlling society, no matter how smart it gets. Even if we were so incredibly stupid as to wire it up to actually control an entire munitions factory, it still couldn't take over society; it only takes one bombing run or called-in artillery strike to end the situation. Meanwhile, in the real world we already trust private ownership of firearm factories, missile factories, and tank factories without a serious risk of a coup. Yet somehow AI is supposed to be what makes its owners a god-king? It strains credulity.
| ||||||||||||||