wolttam an hour ago
In 2023, GPT-4 was allegedly 1.8T parameters. In 2026 we have ~100x smaller models (10-20B) that handily outperform it, and can indeed run on a laptop.
rectang 8 minutes ago | parent
How does "outperform" translate to an LLM's propensity to hallucinate?