ssivark 2 days ago
Uhhh... the above comment has a bunch of loose assertions that are not quite true, but with enough truthiness to make them hard to refute. So I'll point to my other comment for a more nuanced comparison of Markov models with tiny LLMs: https://news.ycombinator.com/item?id=45996794
nazgul17 a day ago
To add to this: the text-generation system as a whole, i.e. the loop that builds the response one token at a time (feeding the text generated so far back into the LLM at each step), is itself a Markov chain. The LLM plays the role of the transition matrix, and the state space is the space of all texts.
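To make that concrete, here is a minimal sketch of the loop being described. The `next_token_distribution` function is a hypothetical stand-in for the LLM: it maps the full text so far (the Markov state) to a distribution over next tokens, using a toy lookup table instead of a neural network.

```python
import random

def next_token_distribution(state):
    # Hypothetical stand-in for an LLM: given the entire text generated so
    # far (the Markov state, a tuple of tokens), return a probability
    # distribution over the next token. A real LLM computes this with a
    # neural network; this toy table just hard-codes a few transitions.
    table = {
        (): {"the": 1.0},
        ("the",): {"cat": 0.5, "dog": 0.5},
        ("the", "cat"): {"sat": 1.0},
        ("the", "dog"): {"sat": 1.0},
    }
    return table.get(state, {"<eos>": 1.0})

def generate(max_tokens=10, seed=0):
    rng = random.Random(seed)
    state = ()  # the Markov state: everything generated so far
    for _ in range(max_tokens):
        dist = next_token_distribution(state)
        tokens, probs = zip(*dist.items())
        tok = rng.choices(tokens, weights=probs)[0]
        if tok == "<eos>":
            break
        # The "transition": append the sampled token, producing a new state.
        # Next step conditions on this new state, so the process is Markov
        # over the space of all texts, even though each step only emits one
        # token.
        state = state + (tok,)
    return " ".join(state)

print(generate())
```

Note that the chain is Markov only because the state is the *entire* text so far, not the last token; over single tokens the process would not be Markov at all.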