sigmoid10 10 hours ago
Most people have no clue how these things really work and what they can do. And then they are surprised that they can't do things that seem "simple" to them. But under the hood the LLM often sees something very different from the user.

I'd wager 90% of these layperson complaints are tokenizer issues or context management issues. Tokenizers have gotten much better, but they still have weird pitfalls and are completely invisible to normal users. Context management used to be much simpler, but now it is extremely complex and sometimes even intentionally hidden from the user (system/developer prompts, function calls, or proprietary reasoning kept back to preserve some sort of "vibe moat").
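A quick way to see the mismatch for yourself (a minimal sketch, assuming the tiktoken library and its cl100k_base encoding; the exact splits vary by model):

    import tiktoken

    # The model receives a sequence of token IDs, not characters, so
    # letter-level questions depend on how the word happens to split.
    enc = tiktoken.get_encoding("cl100k_base")
    for tok in enc.encode("How many r's are in strawberry?"):
        print(tok, repr(enc.decode([tok])))

The printed pieces, not the individual letters, are what the model actually sees, which is why letter-counting questions trip it up in ways that look baffling to a layperson.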
imiric 8 hours ago | parent
> Most people have no clue how these things really work and what they can do.

Primarily because the way these things really work has been buried under a mountain of hype and marketing that uses misleading language to promote what they can hypothetically do.

> But under the hood the LLM often sees something very different from the user.

As a user, I shouldn't need to be aware of what happens under the hood. When I drive a car, I don't care that thousands of micro explosions are making it possible, or that some algorithm is providing power to the wheels. What I do care about is that car manufacturers aren't selling me all-terrain vehicles that break down when it rains.