dansmith1919 | 5 days ago
I think they mean prompt injection, rather than a malformed image triggering a security bug in the processing library.
catlifeonmars | 4 days ago | parent
The LLM is the image-processing library in this case, so you are both right :)