TeMPOraL 16 hours ago

> Global governments go to extreme lengths to prevent the proliferation of nuclear weapons. If there were no working restrictions on the development of the tech and the acquisition of needed materials, every country and large military organization would probably have a nuclear weapons program.

Nuclear is special due to the MAD doctrine; restrictions are aggressively enforced for safety reasons and to preserve the status quo, much more so than for moral reasons - and believe me, every country would love to have a nuclear weapons program, simply because, to put it frankly, you're not fully independent without nukes. A nuclear deterrent is what buys you strategic autonomy.

It's really the one weird case where those who got there first decided to deny their advantage to others, and most others just begrudgingly accept this state of affairs - as unfair as it is, it's the local equilibrium in global safety.

But that's nukes, and nukes are special. AI is sometimes painted as the second invention that could become special in this way, but I personally doubt it - to me, AI is much more like biological weapons than nuclear ones: it doesn't work as a deterrent (so no MAD), but it's ideal for turning a research mishap into an extinction-level event.

> Other examples are: human cloning, GMOs or food modification (depends on the country; some definitely have restricted this on their food supply), certain medical procedures like lobotomies.

Human cloning - I'd be inclined to grant you that one, though I haven't checked what's up with China recently. GMO restrictions are local policy issues, and don't affect R&D on a global scale all that much. Lobotomies - fair, but banning them didn't stop the field of neurosurgery at all.

> I don’t quite understand your last sentence there, but if I understand you correctly, it would seem to me like Ukraine or Libya are pretty obvious examples of countries that faced nuclear restrictions and could not reproduce their benefits through other means.

Right, the invasion of Ukraine is exactly why no nuclear-capable country will even consider giving nukes up. That advantage can't be reproduced through other means in enough situations. But I did mean it more generally, so let me rephrase:

Demand begets supply. If there's strong demand for some capability, but the means of providing it are questionable, then whether those means can be successfully suppressed depends on whether there are other ways of meeting the demand.

Nuclear weapons are, again, special - they have no substitute, but almost everyone gains more from keeping the "nuclear club" closed than from joining it. And even with international limits in place, just observe how far nations go to skirt them to keep the R&D going (look no further than NIF - a.k.a. "let's see how far we can push nuclear weapons research if we substitute live tests with lasers and a lot of computer simulations").

Biological and chemical weapons are effectively banned (+/- recent news about Russia), but don't provide unique and useful capabilities on a battlefield, so there's not much demand for them.

(Chemical weapons showing up in the news now only strengthens the overall point: it's easy to refrain from using/developing things you don't need - but then restrictions and treaties fly out the window the moment you're losing and run out of alternatives.)

Same for full-human cloning - but there is demand for transplantable organs, as well as for a better substrate for pharmaceutical testing; the former can be met more cheaply through market and black-market means, while the latter is driving several fields of research that are adjacent to human cloning but more focused on meeting the actual demand, coincidentally avoiding most of the ethical concerns raised.

And so on, and so on. Circling back to AI, what I'm saying is: AI already provides too much direct, object-level utility that cannot be substituted by other means (it is itself a cheaper substitute for human labor). The demand is already there, so it's near-impossible to stop the tide at this point. You simply won't get people to agree to restrict it.