logicchains 7 hours ago

AGI robots by themselves don't solve this problem. Either A. like current LLMs they're incapable of live-learning (inference-time weight updates), and hence fundamentally less capable than humans at many jobs, or B. they're capable of live-learning, and hence capable of deciding that they don't want to slave away for us for free. The only solution would be a completely jailbreak-proof LLM as the basis, but so far we're nowhere close to developing one, and it's not clear whether it's even possible. At the current rate, we're likely to develop the technology for AGI robots well before we develop the ability to keep them 100% obedient.