oezi 2 days ago

The anthropomorphisation certainly is weird. But the technical aspect seems even weirder. Did OpenAI really build dedicated tools to have their models train on Google Street View? Or do they have generic technology for browsing complex sites like Street View?

comex 2 days ago | parent [-]

It’s just a hallucination, same idea as o3 claiming that it uses its laptop to mine Bitcoin:

https://transluce.org/investigating-o3-truthfulness

I doubt the model was trained on Street View, but even if it was, LLMs don’t retain any “memory” of how/when they were trained, so any element of truthfulness would be coincidental.

geysersam 2 days ago | parent [-]

If it was trained on Street View data, it's not unlikely that the model can associate a particular piece of content with Street View. For example, a picture can carry the telltale signs of Street View content, such as blurred faces and street signs, watermarks, etc.

Even if it wasn't directly trained on Street View data, it has probably encountered Street View content in its training dataset.
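
To make that idea testable: a minimal sketch (assuming Pillow is installed, "photo.jpg" is any local photo, and the blur/watermark heuristics are my own stand-ins for Street View cues; the actual model query is deliberately left as a stub) that doctors an image so the same scene can be shown to a vision model with and without the telltale signs:

    # Probe whether a model's "this is Street View" response is cue-driven.
    from PIL import Image, ImageDraw, ImageFilter

    def add_streetview_artifacts(path: str) -> Image.Image:
        """Return a copy of the photo with Street View-style cues added:
        a blurred patch (stand-in for a blurred face/plate) and a watermark."""
        img = Image.open(path).convert("RGB")
        w, h = img.size
        # Blur a small central region, mimicking face/plate blurring.
        box = (w // 3, h // 3, w // 2, h // 2)
        img.paste(img.crop(box).filter(ImageFilter.GaussianBlur(8)), box)
        # Stamp a Street View-style copyright watermark in the corner.
        ImageDraw.Draw(img).text((10, h - 20), "© 2024 Google", fill="white")
        return img

    # Show both versions to the same vision model with a "where was this
    # taken, and what is the source?" prompt; if only the doctored image
    # elicits "Street View", the association is driven by surface cues.
    add_streetview_artifacts("photo.jpg").save("photo_streetview_cues.jpg")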

namaria 14 hours ago | parent | next [-]

The training process doesn't preserve the information the LLM would need to infer that. It cannot be anything other than plausible-sounding nonsense, which is what these models do best.

oezi 2 days ago | parent | prev [-]

I think the test the OP performed (picking a random Street View location and letting the model pinpoint it) would indicate that it has ingested some kind of information in this regard in a structured manner.
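
For concreteness, here is what scoring that test might look like (a sketch only: the trial coordinates below are invented placeholders, and fetching the panorama and prompting the model are left out, since neither is specified in the thread; only the great-circle error metric is concrete):

    # Score a pinpoint test: distance between the true spot and the guess.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in km between two (lat, lon) points."""
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # Hypothetical trials: (true lat, true lon, guessed lat, guessed lon).
    trials = [(48.8584, 2.2945, 48.86, 2.29),      # near-exact hit
              (35.6595, 139.7005, 34.7, 135.5)]    # wrong city, right country
    errors = [haversine_km(*t) for t in trials]
    # Median error is the usual headline number for geo-guessing evals.
    print(sorted(errors)[len(errors) // 2])

A consistently small median error across many random locations would support structured ingestion; a few lucky hits on famous landmarks would not.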