dahart 2 days ago

> AI may output certain things at a vastly different rate than it appears in the training data

That’s a subjective statement, and generally speaking it’s not true. If it were, LLMs would produce unintelligible text & images. Neural networks fundamentally produce data that is statistically similar to their training data. Context, prompts, and training data are what drive the style. Whatever trends you believe you’re seeing in AI output can be explained by context, prompts, and training data; they aren’t an inherent property of AI.
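
To make the “statistically similar” point concrete, here’s a toy sketch (mine, not anything from the thread, and vastly simpler than an LLM): a unigram model that samples tokens in proportion to training counts reproduces the training frequencies rather than inventing new ones.

    # Toy sketch: "train" by counting token frequencies, then "generate"
    # by sampling in proportion to those counts. Sampled frequencies
    # converge to the training frequencies.
    import collections
    import random

    corpus = "the cat sat on the mat and the dog sat on the rug".split()

    counts = collections.Counter(corpus)          # training statistics
    tokens = list(counts)
    weights = [counts[t] for t in tokens]

    random.seed(0)
    samples = random.choices(tokens, weights=weights, k=100_000)
    sampled = collections.Counter(samples)

    total = sum(counts.values())
    for t in tokens:
        print(f"{t:>4}  train={counts[t]/total:.3f}  sampled={sampled[t]/len(samples):.3f}")

Real models condition on context rather than drawing i.i.d. tokens, but the same principle applies: output distributions track the training distribution, shifted by the prompt and context.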

Extra fingers are what’s known as hallucination, so if you mean a different phenomenon, it’s not clear what you’re referring to, and your analogy to fingers doesn’t hold. In the case of images the tokens are pixels, while in the case of LLMs the tokens are approximately syllables. Finger hallucinations reflect a lack of larger structural understanding, but they still statistically mimic the inputs; they are not examples of frequency differences.
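
For a quick look at what LLM tokens actually are, here’s a small sketch (mine, assuming the tiktoken package is installed): the tokenizer splits text into subword pieces, often roughly syllable-sized, rather than whole words or single characters.

    # Show how a common LLM tokenizer breaks words into subword pieces.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    for word in ["statistically", "hallucination", "fingers"]:
        token_ids = enc.encode(word)
        pieces = [enc.decode_single_token_bytes(t).decode("utf-8", errors="replace")
                  for t in token_ids]
        print(word, "->", pieces)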