ACCount37 a day ago

When you're training an AI, that "mere data" adds up. Random error averages out, getting closer to zero with every data point. Systematic error leaks information about the system that keeps making the error.
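The averaging claim is easy to demonstrate. A minimal sketch (toy values, not from the thread): the mean of zero-centered random noise shrinks toward zero as samples accumulate, while a constant systematic bias survives any amount of averaging.

```python
import random

random.seed(0)

TRUE_VALUE = 10.0

def mean_error(n, bias=0.0):
    """Average deviation from the true value over n noisy samples."""
    samples = [TRUE_VALUE + bias + random.gauss(0, 1.0) for _ in range(n)]
    return sum(samples) / n - TRUE_VALUE

# Random error: the mean deviation shrinks roughly as 1/sqrt(n).
residual = mean_error(100_000)

# Systematic error: averaging converges on the bias itself,
# which is exactly what "leaks information about the system" means.
biased = mean_error(100_000, bias=0.5)

print(residual, biased)
```

With enough samples, `residual` sits near zero while `biased` sits near 0.5: the noise cancels, the bias does not.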

A Harry Potter book doesn't ruin an AI's world model by contaminating reality with fantasy. It gives it valuable data points on human culture and imagination and fiction tropes and commercially successful creative works. All of which is a part of the broader "reality" the AI is trying to grasp the shape of as it learns from the vast unstructured dataset.

nathan_douglas a day ago | parent | next

You're absolutely correct, of course. I was musing during down time in a meeting and turned it into a joke instead of engaging my faculties :)

multjoy a day ago | parent | prev

The AI learns nothing from Harry Potter other than the statistical likelihood of one token appearing after another.
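Taken literally, "the statistical likelihood of one token appearing after another" is a bigram model. A minimal sketch over a toy corpus (the corpus and tokens here are illustrative, not from the thread):

```python
from collections import Counter, defaultdict

# Toy corpus, whitespace-tokenized.
corpus = "the boy who lived the boy who waved".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

# Conditional next-token distribution P(next | "who").
total = sum(follows["who"].values())
probs = {tok: c / total for tok, c in follows["who"].items()}
print(probs)  # {'lived': 0.5, 'waved': 0.5}
```

Whether a much larger model of the same conditional distribution amounts to a world model is exactly the point the two commenters dispute.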

The AI is trying to grasp nothing.

ACCount37 a day ago | parent

Any sufficiently advanced statistical model is a world model.

If you think that what your own brain is doing isn't fancy statistics plugged into a prediction engine, I have some news for you.