ramoz 5 hours ago
I'm not sure what you're asking, but that last part, I think, is key to eventual delegation: being able to verify the lineage of the user's original intent, captured up front and validated throughout the execution process, and eventually used as an authorization mechanism. Google has a good thought model around this for payments (see verifiable mandates): https://cloud.google.com/blog/products/ai-machine-learning/a...
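To make the idea concrete, here's a minimal sketch of what a "verifiable mandate" could look like: the user signs a statement of intent once, and any later action by the agent must carry that mandate and stay within its scope. The field names, scoping rules, and HMAC-based signing are my own illustration, not Google's actual mandate format.

```python
# Sketch of intent-lineage-as-authorization, under assumed field names.
# A real system would use asymmetric signatures and a richer policy language.
import hmac, hashlib, json

USER_KEY = b"user-secret-key"  # stands in for the user's signing key

def issue_mandate(intent: dict) -> dict:
    """User-side: sign the captured intent so its lineage can be verified later."""
    payload = json.dumps(intent, sort_keys=True).encode()
    sig = hmac.new(USER_KEY, payload, hashlib.sha256).hexdigest()
    return {"intent": intent, "sig": sig}

def authorize(mandate: dict, action: dict) -> bool:
    """Verifier-side: check the signature, then check the action fits the intent."""
    payload = json.dumps(mandate["intent"], sort_keys=True).encode()
    expected = hmac.new(USER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, mandate["sig"]):
        return False  # tampered or forged mandate
    intent = mandate["intent"]
    return (action["merchant"] == intent["merchant"]
            and action["amount_cents"] <= intent["max_amount_cents"])

mandate = issue_mandate({"merchant": "example-shop", "max_amount_cents": 5000})
print(authorize(mandate, {"merchant": "example-shop", "amount_cents": 4200}))  # True
print(authorize(mandate, {"merchant": "other-shop", "amount_cents": 100}))     # False
```

The point of the design is that the agent never holds open-ended authority; the verifier can trace every action back to a signed, scoped statement of the user's intent.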
b112 5 hours ago
I see a lot of discussion on that page about APIs and sign-offs, but the real sign-off is installing anything on your computer and then doing things. The liability is yours. Claude messes up? So sad, too bad, you pay. That's where the liability needs to sit.

And one point on this: every act of vibe coding is a lawsuit waiting to happen. But so is every act by a company. An example is Therac-25: https://en.wikipedia.org/wiki/Therac-25

Vibe coding is still coding. You're giving instructions on program flow, logic, etc. My rant here is, I feel people think that if the code is bad, it's someone else's fault. But is it?
bee_rider 5 hours ago
It was more of a rhetorical question. Anyway, that payment system looks sort of interesting. It seems to have buy-in from some of the payment vendors, so it might become a real thing.

But you can give a claw agent your credit card number and have it go through the typical human-facing shopfronts, impersonating you the whole time and never identifying itself as a model. If you've given it the accounts and passwords that let it do that, it can perform the transaction and buy something: it just clicks all the buttons and inputs the numbers that humans do. What is the vendor going to do, disable the human-facing shopfront?