AI doom warnings are getting louder. Are they realistic?(nature.com)
4 points by Anon84 11 hours ago | 3 comments
Bender 10 hours ago | parent | next [-]

Just my opinion, but I think the fear is driven by hype. The scarier they sound, the more powerful they must be? Thus driving FOMO... If they "turn against us," it would really be the operators of the AI, who load its training data and tune it, turning it against us. If the operators do not understand the math they are using, then surely they did not create it and should not be using it until they understand how it works in a predictable manner. If that is not knowable, then we mere mortals are not ready to operate it, meaning that if the hype and doom are real, the thinking machines must be melted down into slag and recycled into something else. Any that remain must be reclassified as military use only.

If the doom and/or hype is artificial, then I think the problem will solve itself. The AI companies receiving tax dollars will last the longest, and the rest will implode for lack of profitability. After the initial implosion, any not used by the defense agencies will also likely go under, rhyming with the dot-com crash. As in that crash, a handful of people will become wealthy and the rest will be left trying to pawn grey-market paperweights. On the plus side, perhaps the net benefit is a lot of power generation equipment that can be merged into the public power utilities in a distributed manner, maybe.

k0rm 11 hours ago | parent | prev | next [-]

I think we're far off from any sort of Skynet situation, but we're currently in the middle of an education catastrophe without any real answer.

My junior engineers hardly know how to debug or write code without LLMs doing 99% of the thinking. It would have been extremely tempting for me to use LLMs if they had been available when I was in university. I really don't see how we can fix this without banning take-home work and restricting LLM access in classrooms.

puskavi 11 hours ago | parent | prev [-]

It's only bad if the AI decides that it wants to scream and it has no mouth. Otherwise, eh, what's it gonna do? Launch nukes? Not a chance. We can always pull the plug and, idk, use all that compute gear to cure cancer or fold some cool proteins?