elicash 6 hours ago:
> Like, imagine if I owned a toll road and started putting up road signs to "convince" Waymo cars to go to that road.

I think a clearer parallel with self-driving cars would be the attempts at adding barcodes to road signs or white lights to traffic signals. There's nothing about any of these examples I find creepy. I think the best argument against the original post is that it's an attempt at prompt injection or something. But at the end of the day, it reads to me as innocent and helpful, and the only question is whether, if it were actually successful, the approach could be abused by others.
streetfighter64 5 hours ago:
Well yes, it would pretty clearly be classed as "prompt injection", given that it's trying to get the LLM to give them money, or to "persuade" a human to do so. Of course the fault lies mainly with whoever deployed the LLM in the first place, but I still think it's misguided to try to convince LLM "agents" to make financial transactions for your own benefit. It'd be much more ethical to just block them.
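
For what it's worth, "just block them" doesn't take much: a user-agent check that returns a plain 403 refuses the agent without trying to influence it. A minimal sketch, assuming a Flask app; the agent tokens listed are illustrative examples, and a real blocklist would be longer and kept up to date:

```python
from flask import Flask, request, abort

app = Flask(__name__)

# Illustrative user-agent substrings for known LLM crawlers/agents;
# an actual deployment would maintain a fuller, updated list.
BLOCKED_AGENT_TOKENS = ("GPTBot", "ClaudeBot", "CCBot")

@app.before_request
def block_llm_agents():
    ua = request.headers.get("User-Agent", "")
    if any(token in ua for token in BLOCKED_AGENT_TOKENS):
        # Plain refusal: no money talk, no attempt to "persuade" the agent.
        abort(403)
```

(This only stops agents that identify themselves honestly, which is also why it's the ethical baseline rather than a hard guarantee.)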