israrkhan, a day ago:
While I agree that LLMs do not have thoughts or plans and are merely text generators, once you give a text generator the ability to make decisions and take actions by integrating it with the real world, there are consequences. Imagine this LLM were inside a robot, and the robot had the ability to shoot. Who would you blame?
gchamonlive, a day ago:
That depends. If this hypothetical robot existed in a hypothetical functional democracy, I'd blame the people who elected leaders whose agenda was to create laws allowing these kinds of robots to operate. If not, I'd blame the class that seized power and steered society toward delegating the use of force to AIs in order to preserve whatever distorted view of order those in power hold.
wwweston, a day ago:
I would blame the damned fool who decided autonomous weapons systems should have narrative-influenced decision-making capabilities.