▲ | what 17 hours ago |
That would still be on whoever set up the agent and allowed it to take action, though.
▲ | mitthrowaway2 17 hours ago | parent | next [-]
To professional engineers who have a duty towards public safety, it's not enough to build an unsafe footbridge and hang up a sign saying "cross at your own risk". It's certainly not enough to build a cheap, un-flight-worthy airplane and then say "but if this crashes, that's on the airline dumb enough to fly it". And it's very certainly not enough to put cars on the road with no working brakes, while saying "the duty of safety is on whoever chose to turn the key and push the gas pedal". For most of us, we do actually have to do better than that. But apparently not AI engineers?
▲ | actsasbuffoon 17 hours ago | parent | prev [-]
As far as responsibility goes, sure. But when companies push LLMs into decision-making roles, you could end up being hurt by this even if you're not the responsible party. If you thought bureaucracy was dumb before, wait until the humans are replaced with LLMs that can be tricked into telling you how to make meth by asking them to role-play as Dr. House.