wongarsu 6 hours ago
The alignment angle doesn't require agency or motives. It's much more about humans setting goals that are poor proxies for what they actually want, like the classic paperclip optimizer that isn't given the necessary constraints of keeping Earth habitable, humans alive, etc. Similarly, I don't think RentAHuman requires AI to have agency or motives, even if that's how they present themselves. I could simply move $10,000 into a crypto wallet, rig up Claude to run in an agentic loop, and tell it to multiply that money. Lots of plausible ways of doing that could lead to Claude going to RentAHuman for various real-world tasks: setting up and restocking a vending machine, visiting government offices in person to get permits and taxes sorted out, putting out flyers or similar advertising. The issue with RentAHuman is simply that approximately nobody is doing this, and with the current state of AI it would likely be ill-advised to try.
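For concreteness, here's a minimal sketch of what "rig up Claude in an agentic loop" could look like, using the Anthropic Python SDK's tool-use pattern. The `rent_a_human` tool, its schema, and the stubbed result are hypothetical stand-ins (RentAHuman's actual interface isn't shown here), and the model name may need updating:

```python
# Minimal agentic loop: Claude decides, a (hypothetical) tool acts in the world.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

tools = [{
    "name": "rent_a_human",  # hypothetical tool, not a real RentAHuman API
    "description": "Hire a person to perform a real-world task, "
                   "paid from the crypto wallet.",
    "input_schema": {
        "type": "object",
        "properties": {
            "task": {"type": "string", "description": "What the human should do"},
            "budget_usd": {"type": "number", "description": "Maximum payment"},
        },
        "required": ["task", "budget_usd"],
    },
}]

messages = [{"role": "user",
             "content": "You control a wallet holding $10,000. Multiply it."}]

while True:
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        tools=tools,
        messages=messages,
    )
    if response.stop_reason != "tool_use":
        break  # model gave a final answer instead of requesting a tool call
    messages.append({"role": "assistant", "content": response.content})
    results = []
    for block in response.content:
        if block.type == "tool_use":
            # In a real system this is where money would leave the wallet;
            # here the outcome is stubbed so the loop can continue.
            results.append({
                "type": "tool_result",
                "tool_use_id": block.id,
                "content": f"Task accepted: {block.input['task']}",
            })
    messages.append({"role": "user", "content": results})
```

Note that nothing in the loop above has motives; the alignment risk is entirely in how loosely "multiply that money" constrains what the tool calls end up doing.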
bko 5 hours ago
My issue with RentAHuman is its marketing and branding. It's ominous and dark on purpose. Just give me a TaskRabbit that accepts crypto and has an API.
jnamaya 5 hours ago
Good luck giving Claude $10,000. I was just trading NASDAQ futures and asking Gemini for feedback on what to do. It was completely off. I was playing the human role, feeding it all the information and screenshots of the charts, while it made the decisions. It's not there yet!
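That workflow, roughly, is a human shuttling screenshots to a multimodal model. A minimal sketch using the google-generativeai SDK; the file path, model name, and prompt are illustrative assumptions, not what was actually run:

```python
# Human-in-the-loop trading "advice": send a chart screenshot, get a read back.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder
model = genai.GenerativeModel("gemini-1.5-pro")

chart = Image.open("nq_futures_5min.png")  # hypothetical screenshot path
response = model.generate_content([
    chart,
    "These are NASDAQ futures on a 5-minute chart. "
    "Should I go long, short, or stay flat, and why?",
])
print(response.text)  # the model's read, which per the above was completely off
```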