wavemode a day ago

It may actually have the opposite effect - the instruction not to use prior knowledge may be what caused Gemini 3 to assume incorrect details about how certain puzzles worked and get stuck for hours. It knew the right answer (from some game walkthrough in its training data), but intentionally went in a different direction to pretend that it didn't. So, paradoxically, the test results end up worse than if the model truly didn't know.