codyvoda a day ago

sure but we’re talking about literal text, not physical drugs or bomb-making materials. censorship is silly for LLMs and “jailbreaking” as a concept for LLMs is silly. this entire line of discussion is silly
kennywinker a day ago | parent

Except it’s not, because people are deploying LLMs in real products and assuming the guardrails they put on them will hold. As an example, I’m thinking of the car dealership chatbot that gave away $1 cars: https://futurism.com/the-byte/car-dealership-ai

If these things are being sold as something that can be locked down, it’s fair game to find holes in those lockdowns.