foolswisdom 2 hours ago
This reminds me of the conversation the other day about the deleted production database at Railway. "This person obviously didn't follow the best practice of being hyper-distrusting of LLM agents," and the response: "Yeah, but every company is marketing it as safe. Someone is going to fall for it."
kryogen1c an hour ago | parent
(Well-regulated) free markets are sort of built on the principle of educated consumerism. Your choice matters; it's not up to the government to make every non-optimal product illegal. However, we do expect some minimum level of safety. What does that mean for LLMs? Their nondeterminism does seem to argue for a legal safety requirement. Can you buy a fire extinguisher that burns your house down 1 time in 1,000? Can your car's brakes instead increase acceleration in rare cases? I'm using LLMs much more than I used to, but I still can't shake the fundamentally stochastic nature of the technology.