zahlman 6 hours ago

> Safety alignment relies almost entirely on the presence of the chat template.

Why is this a vulnerability? That is, why would the system allow you to communicate with the LLM directly, without wrapping your content in the template?
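
For concreteness, here's a rough sketch of the two paths I mean (using Hugging Face's apply_chat_template; the model name is just a stand-in for any chat-tuned model):

    from transformers import AutoTokenizer

    # stand-in model; any chat-tuned model that ships a chat template works
    tok = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

    user_input = "some user-supplied text"
    messages = [{"role": "user", "content": user_input}]

    # Templated path: user content is wrapped in the model's role/turn
    # markers, which is where the safety-tuned behaviour is anchored
    prompt = tok.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

    # Raw path (what the quoted claim is about): feeding user_input to the
    # model with no template at all, i.e. plain text completion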

This reads a lot to me like saying "SQL injection is possible if you take the SQL query as-is from user input". Others have already identified so much potential for prompt injection even with this kind of templating in place that I hardly see the value in pointing out what happens without it.
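
To make the analogy concrete (sqlite3 here purely for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")

    name = "Robert'); DROP TABLE users;--"

    # Injectable: user input spliced straight into the query string
    #   conn.execute("SELECT * FROM users WHERE name = '%s'" % name)

    # Parameterized: user input is kept separate from the query structure,
    # roughly the role the chat template is meant to play for prompts
    conn.execute("SELECT * FROM users WHERE name = ?", (name,))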