famouswaffles 13 hours ago
I did not even remember you and had to dig to find out what you were on about. Just a heads up: if you've had a previous argument that you want to bring up later, just speak plainly. Why act like "somebody" is anyone but you? My response to both of you is the same. LLMs do depend on previous events, but you say they don't because you've redefined "state" to include previous events. That's a circular argument. In a Markov chain, state is well defined; it's not something you can fold arbitrary properties into or redefine as you wish. It's not my fault that neither of you understands what the Markov property is.
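For reference, the Markov property in its usual form: conditioned on the current state, the next step is independent of everything earlier,

    P(X_{t+1} = x | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} = x | X_t)

The whole disagreement here is over what counts as the state X_t.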
chpatrick 9 hours ago | parent
By that definition, n-gram Markov chain text generators also depend on previous events, because you always feed in the last n grams. :) It's exactly the same situation as with LLMs, just with a larger, but still fixed, n.
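To make that concrete, here is a minimal Python sketch of an n-gram Markov chain text generator (the toy corpus and function names are purely illustrative, not from either comment): the only "state" the next token is drawn from is the last n tokens, just as an LLM's next token is drawn from a fixed-size context window.

    # Toy n-gram Markov chain text generator: the only "state" is the last n tokens.
    import random
    from collections import defaultdict

    def build_model(tokens, n=2):
        # Map each n-token state to the tokens observed to follow it.
        model = defaultdict(list)
        for i in range(len(tokens) - n):
            state = tuple(tokens[i:i + n])
            model[state].append(tokens[i + n])
        return model

    def generate(model, seed, n=2, length=20):
        out = list(seed)
        for _ in range(length):
            state = tuple(out[-n:])   # the next token depends only on this fixed-size state
            followers = model.get(state)
            if not followers:
                break
            out.append(random.choice(followers))
        return " ".join(out)

    corpus = "the cat sat on the mat and the cat ate the rat".split()
    print(generate(build_model(corpus), seed=["the", "cat"]))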