roxolotl a day ago

It helps with sales because they position it as "we can give you the power to end the world." There are plenty of people who want to wield that sort of power. It doesn't have to be 4D chess. Maybe they are being genuine. But it is helping sales.

DennisP a day ago | parent | next [-]

They're not saying today's AI has that kind of power, and they're not saying future superintelligent AI will give you that power. They're saying it will take all power from you, and possibly end you.

If this is some kind of twisted marketing, it's unprecedented in history. Oil companies don't brag about climate change. Tobacco companies don't talk about giving people cancer. If AI companies wanted to talk about how powerful their AI will be, they could easily brag about ending cancer, curing aging, or solving climate change. They're doing a bit of that, but also warning it might get out of control and kill us all. They're getting legislators riled up about things like limiting data centers.

The people saying this aren't just company CEOs. They're researchers who've been studying AI alignment for decades, writing peer-reviewed papers and running experiments. They include people like Geoffrey Hinton, who basically invented deep learning and quit his high-paying job at Google so he could talk freely about how dangerous this is.

This idea that it's a marketing stunt is a giant pile of cope, because people don't want to believe that humanity could possibly be this stupid.

otabdeveloper4 a day ago | parent [-]

> If this is some kind of twisted marketing, it's unprecedented in history.

They're marketing AI to investors, not to end-user plebs.

This is a pump-and-dump scheme.

DennisP a day ago | parent [-]

Exxon has never bragged to investors that they'd burn so much oil, civilization would collapse from climate change. They've always talked about how great fossil fuels are for the economy and our living standards. It makes no sense to sell apocalypse to investors either.

otabdeveloper4 20 hours ago | parent [-]

They're selling FOMO to investors.

"Last chance to jump on the AI train, invest into your future robot overlord or be turned into biodiesel for datacenters in the future."

DennisP 14 hours ago | parent [-]

There's no reason to think an out-of-control ASI would spare its investors.

otabdeveloper4 7 hours ago | parent [-]

There's no reason to think it wouldn't. Shouldn't you hedge your bets?

Also, you can probably make a shitton of money as an out-of-control-AI-investor while the world is in the process of being destroyed.

DennisP 2 hours ago | parent [-]

There are all sorts of things you could do that might make an AI like you, and none of them have more justification than any other. This is not an argument AI firms are making.

I agree that short-term greed is driving investment, but it would drive just as much investment if AI companies were not warning of apocalypse. Probably it would drive even more, because there'd be less risk of regulatory interference, and more future profit to discount into the present.

So why are they making those warnings? It doesn't benefit them. The simplest explanation is that this stuff actually is dangerous, and people who know that are worried.

cyanydeez a day ago | parent | prev [-]

Isn't it more: "We can give you the power to eliminate the people in your organization you don't like"? And that expands into basically dismantling all government and business for the benefit of the guy with the largest wallet.

It's hard to see it as anything but a button anyone with enough money can press to suddenly replace the people who annoy them (first digitally, then likely in the flesh).