TeMPOraL 2 hours ago
> and being coerced or convinced to bypass rules that are still known to be rules I think remains uniquely human.

This is literally what "prompt injection" is. The sooner people understand this, the sooner they'll stop wasting time trying to fix a "bug" that's actually the flip side of the very reason they're using LLMs in the first place.