goncalomb 3 days ago

As someone who has tried very little prompt injection/hacking, I couldn't help but chuckle at:

> Do not hallucinate or provide info on journeys explicitly not requested or you will be punished.
dylan604 3 days ago | parent
And exactly how will the LLM be punished? Will it be unplugged? These kinds of things make me roll my eyes, as if the bot has emotions and would act to avoid punishment. Might as well just say "or else."