comex 2 days ago

It’s just a hallucination, same idea as o3 claiming that it uses its laptop to mine Bitcoin:

https://transluce.org/investigating-o3-truthfulness

I doubt the model was trained on Street View, but even if it was, LLMs don’t retain any “memory” of how/when they were trained, so any element of truthfulness would be coincidental.

geysersam 2 days ago | parent [-]

If it's trained on Street View data, it's not unlikely that the model can associate a particular piece of context with Street View. For example, a picture can have telltale signs of Street View content, such as blurred faces and street signs, watermarks, etc.

Even if it wasn't directly trained on Street View data, it has probably encountered Street View content in its training dataset.
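As a toy illustration of that idea (not anything from an actual model), a system doesn't need an explicit provenance label to guess a source if the surface features are telltale enough. All feature names and weights below are made up:

```python
# Toy sketch: inferring "Street View-like" provenance from surface
# features alone, with no explicit source label. Feature names and
# weights are hypothetical illustrations, not a real detector.

def street_view_score(features: dict) -> float:
    """Sum the weights of the telltale signs present in the image."""
    weights = {
        "blurred_faces": 0.4,     # Street View blurs faces
        "blurred_plates": 0.3,    # ...and license plates
        "watermark_text": 0.3,    # visible provider watermark
    }
    return sum(w for k, w in weights.items() if features.get(k))

score = street_view_score({"blurred_faces": True, "watermark_text": True})
print(round(score, 2))  # 0.7
```

The point is just that correlated artifacts of a data source can act as an implicit label, which is all the association would need.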