getpokedagain 3 hours ago

We are anthropomorphizing whenever we refer to prompts as instructions to models. They predict text; they don't obey our orders.

gigatree 2 hours ago | parent [-]

That’s not how language works; it's just how engineers think it works.