soulofmischief 2 hours ago

It is not anthropomorphism. An LLM is literally a prediction model, and saying that a model "assumes" something is common parlance. This isn't new to neural models; it's how we talk about models of every kind, from physical to conceptual.

And in the case of an LLM, which walks a noncommutative path across a probabilistic knowledge manifold, it's incorrect to oversimplify its capabilities as merely parroting a training dataset. It has an internal world model and is capable of simulation.
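To make the "noncommutative path" point concrete, here's a minimal sketch of autoregressive sampling. The conditional distributions are an invented toy stand-in for a real model's P(next token | context), not from any actual LLM; the point is only that each sampled token is fed back into the context and reshapes every later distribution, so the order of choices matters:

    import random

    # Toy stand-in for a real model's conditional distribution
    # P(next | context). All probabilities are illustrative.
    MODEL = {
        ("the",): {"cat": 0.6, "dog": 0.4},
        ("the", "cat"): {"sat": 0.7, "ran": 0.3},
        ("the", "dog"): {"ran": 0.8, "sat": 0.2},
        ("the", "cat", "sat"): {"down": 1.0},
        ("the", "cat", "ran"): {"off": 1.0},
        ("the", "dog", "ran"): {"off": 1.0},
        ("the", "dog", "sat"): {"down": 1.0},
    }

    def sample_next(context):
        """Sample one token from the distribution conditioned on context."""
        dist = MODEL[tuple(context)]
        tokens, probs = zip(*dist.items())
        return random.choices(tokens, weights=probs)[0]

    def generate(prompt, steps):
        """Autoregressive decoding: each sampled token is appended to
        the context, changing every subsequent distribution. The result
        is a path through the model's state, not a bag of words."""
        context = list(prompt)
        for _ in range(steps):
            context.append(sample_next(context))
        return context

    print(generate(["the"], 3))  # e.g. ['the', 'cat', 'sat', 'down']

Sampling "cat" first forecloses the continuations that "dog" would have opened, which is the sense in which generation is order-dependent rather than a simple lookup against training data.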