iLoveOncall a day ago
> If you allow an LLM to edit your code and also give it access to untrusted data (like the Internet), you have a security problem.

You don't even need to give it Internet access to have issues. The training data itself is untrusted: it's all but guaranteed that bad actors are spreading compromised code to poison the training data of future models.