lukan 3 hours ago

"The model (opus 4.6) just could not stop drawing conclusions from obsolete regulations in its training dataset"

To be fair, humans are also often like this. If some rule/law/model was deeply ingrained into them, they often cannot stop thinking in terms of that rule, even if they are clearly in a new context (like a new country).

xyzal 3 hours ago | parent [-]

When the mandatory speed limit in my country was reduced from 60km/h to 50km/h in cities, 95 percent of people instantly adapted.

lukan 3 hours ago | parent [-]

But that is pretty much the same rule, just with the number slightly adjusted. What do you think would happen if they switched traffic from driving on the right to driving on the left?

xyzal 3 hours ago | parent [-]

Heh, that would surely be funny :) But at least most people would know there is a new law in effect, and if they are unsure, they know to seek expert guidance. The model, even with explicit notification, is unable to reflect on this fact. How is it supposed to be useful then?

lukan 3 hours ago | parent [-]

Oh, most people would know in theory, for sure, but once they actually start driving, habit kicks in and they end up in the wrong lane pretty quickly.

At least that is what happened to me in Australia. I only had a year of driving practice back then, but driving on the right side was already deeply ingrained, and I had to be constantly aware of what I was doing.

But to be clear, I am not arguing that models have real understanding of anything - I know they don't. My point was that humans can be similar in appearing to have understood something, yet if their core habits were formed differently, they will quickly fall back into old patterns.