| ▲ | queenkjuul 2 hours ago | |
An LLM is a large and complex machine, not a screwdriver. Large and complex [physical] machines are built with safeguards to prevent misuse, injury, etc by regulation. | ||
| ▲ | nuancebydefault an hour ago | parent [-] | |
LLM's are in principle text in / text out machines. If the user extends its capability to have agency over a production database or a machine, there's nothing that can safeguard the safety. Imagine I ask an LLM to instruct left/right/speed up/slow down while driving. I can simply bypass any safeguard by stating i suddenly became blind while driving a car. While in fact i'm blindfolded and doing an experiment on a highway. | ||