SpicyLemonZest 12 hours ago
There were historical worries about whether a ban would be feasible, but frontier AI research as we understand it today requires large amounts of specialized compute. Even if we couldn't or wouldn't destroy the chips, we could imprison anyone who tries to start a large training run, the same way we imprison anyone who tries to buy enriched uranium.
neonstatic an hour ago | parent
Yes, that is true, but it's not my point. I'm not saying it would be impossible to find the people doing it. My point is that there will always be a group of people willing to do potentially dangerous things, as long as those things are possible and are believed to confer some advantage. Sooner or later such people will either hold decision-making positions themselves or have an offer attractive enough to sway the decision makers. Speaking of uranium: I don't think AI is anything like it (however much the AI industry's propaganda wants us to believe it is), but even there we have examples of countries that pursued nuclear weapons, both successfully and unsuccessfully, as well as countries that could have built them but chose not to. So the ban itself isn't necessarily the main point here.