candiddevmike 3 hours ago

Because, in my experience/conspiracy theory, the model providers are trying to make the models function better without needing these kinds of workarounds. So there's a disconnect: folks keep adding more explicit instructions, while the models are being trained to effectively ignore them under the guise of relying on their innate intuition/better learning/mixture of experts.