| ▲ | saltyoldman a day ago |
| The countries that they're in already do via the law. No one else should "control" someone. |
| ▲ | rolph a day ago | parent | next [-] |
No one should have to control someone until they become a threat. When someone presents a threat at large, they have limited entitlement to walk among society or to act without review.
| ▲ | JumpCrisscross a day ago | parent [-] |

> no one should have to control some one, until they become a threat

The Helots were a threat to the Spartans. Black Haitians to the French. Jews to the Reich. Threats feel like reasonable grounds for reducing another's rights, but they turn out to be the most common way of tricking oneself into becoming a monster.
| ▲ | gobdovan a day ago | parent | next [-] |

I am starting to believe a significant number of humans run a computation that goes something like this: "Can I control AI? Will I personally meet people who control AI? If not, why would I care if they're treated unfairly in the abstract? The most important thing for me is that they don't affect my resources in any way. They're better off than most either way; if anything, not willingly reducing their power shows greed and confirms they're threats."
| ▲ | JumpCrisscross a day ago | parent [-] |

I interpret it more generously. When a pet or a child misbehaves, we constrain their behavior. For most people, I'd guess that's the majority of the bad behavior they come across in daily life.

(When adults misbehave, one usually distances or confronts. The latter isn't an option for a difficult-to-reach public figure, and some of these figures make distancing difficult, too.)
| ▲ | monknomo a day ago | parent | prev | next [-] |

Are you comparing the AI CEOs to Helots? I am confused.
| ▲ | rolph a day ago | parent | prev | next [-] |

I fixed that for you: "The Spartans were a threat to the Helots, the French to Black Haitians, the Reich to the Jews." Justification doesn't transform a victim into a threat.
| ▲ | Nasrudith a day ago | parent | next [-] |

The whole point is that the self-fulfilling prophecy, and the cruelty that created the victims, is exactly what threatened them later. One reductio-ad-absurdum hypothetical I give for this type of self-fulfilling prophecy from fallacious logic: if group A decided that, say, all redheads were vicious bandits who would kill them on sight and therefore should be killed, guess who is now incentivized to kill group A on sight?
| ▲ | metalman 20 hours ago | parent | prev [-] |

I'll do a bit more "fixing":

> justification, doesnt transform a victim into a threat

Unless the victim is Palestinian and the monsters are Judaic Zionist terrorists, for more than 100 years now.
| ▲ | rolph 15 hours ago | parent [-] |

Oops, you broke it again, stahlmann. A victim is a victim, period. It doesn't matter who is right or wrong at the start; there is the attacker, and the attacked. Victim and attacker swap places as they go around the wheel.

Now, what breaks everything is when a militant in combat is spun as a victim defending himself from a mother and child. Generalizing based on nationality, or eye color, or anything else, is the actual problem you seem to be concerned about. Let that be your last battlefield.

https://en.wikipedia.org/wiki/Let_That_Be_Your_Last_Battlefi...
| ▲ | npfo-hn a day ago | parent | prev [-] |

Congratulations! You just compared regulating the behavior of a handful of billionaires to the Holocaust! You just equated the idea that there should be some democratic restrictions on corporate activity with death camps that murdered millions! You win the "most HN post of the month" award.

Never change, HN. Never change.
| ▲ | npfo-hn a day ago | parent | next [-] |

> Jews to the Reich.

Yes, they did.
| ▲ | JumpCrisscross a day ago | parent | prev | next [-] |

> You just compared regulating the behavior of a handful of billionaires to the holocaust!

On the most surface level, sure. Regulating something and controlling someone are, to me, different motivations.
| ▲ | operatingthetan a day ago | parent | prev [-] |

> You just compared regulating the behavior of a handful of billionaires to the holocaust!

They literally did not.
| ▲ | rgbrgb a day ago | parent | prev | next [-] |
To be fair, that's exactly what's at issue: controlling AI implies controlling society as intelligence scales.
| ▲ | Nasrudith a day ago | parent [-] |

This is singularitarian fallacies all over again, like "being able to make something smarter than a human means infinitely smart, because it can just keep making something smarter," while ignoring the multifaceted nature of intelligence and the time and other costs involved in creation. It all gets handwaved away, as if superintelligence somehow enabled goddamned sorcery that ignores physical constraints. Reality does not work that way.

It reminds me of the "Einstein's superintelligent cat" refutation of such fallacies. It goes something like this: imagine Einstein has a superintelligent cat. The room has only one door, and it is locked. The cat is not capable of opening the lock, for lack of manual dexterity. The cat does not want to go into the carrier. Einstein, however, is an order of magnitude greater in mass. As much as the cat might want to escape Albert Einstein's grip, it cannot. The superintelligent cat is going in the carrier.

The point being that, no, controlling or creating AI does not in fact equate to controlling society, no matter how smart the AI gets. Even if we were so incredibly stupid as to wire it up to actually control an entire munitions factory, it still couldn't take over society; it only takes one bombing run or called-in artillery strike to end the situation. Yet in the real world we trust private ownership of firearm factories, missile factories, and tank factories without a serious risk of a coup. Yet somehow AI is supposed to be what makes its owners god-kings? It strains credulity.
| ▲ | stratos123 12 hours ago | parent | next [-] |

These arguments have been going on for more than a decade and have been silly the whole time.

> It reminds me of the 'Einstein's superintelligent cat' refutation to such fallacies.

One of the many problems with this "refutation" is that in reality, not only does nobody bother to lock the superintelligent cat in a room and leave it no available actions, but you're lucky if they don't hook the cat up directly to the internet. It doesn't matter whether you could maybe control a superintelligence if you were very careful and treating it very seriously, when nobody is even trying, much less being very careful.
| ▲ | pixl97 18 hours ago | parent | prev [-] |

> Yet in the real world we can trust private ownership of firearm factories, missile factories, and tank factories without a serious risk of a coup

Because they are highly fucking regulated... Start selling missiles to kids and watch yourself get put in a cage.
| ▲ | bigyabai a day ago | parent | prev [-] |
| The law is only relevant insofar as it's enforced. In America, that's a tossup. |
| ▲ | SilentM68 a day ago | parent [-] |

Good point. People do not consider the scenario where one billionaire decides to take their wealth and resources and hunker down in a dictator-controlled country where extradition does not apply; that person could easily experiment and create an AI that may not necessarily see us as relevant to its existence.

I probably won't be able to respond to this comment, since some people on this forum have flagged my comments as inappropriate, limiting the number of daily posts I can make :)