famouswaffles 13 hours ago

I did not even remember you and had to dig to find out what you were on about. Just a heads up: if you've had a previous argument and want to bring it up later, just speak plainly. Why act like "somebody" is anyone but you?

My response to both of you is the same.

LLMs do depend on previous events, but you say they don't because you've redefined state to include previous events. It's a circular argument. In a Markov chain, state is well defined; it's not something you can stuff arbitrary properties into or redefine however you like.
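
To make that concrete, here's a toy first-order Markov chain in Python (the states and probabilities are made up for illustration, not from anything above): the next-state distribution depends only on the current state, not on how you got there.

    import random

    # Toy first-order Markov chain. The Markov property: the distribution
    # of the next state depends only on the current state, not on the path
    # taken to reach it.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(current):
        dist = TRANSITIONS[current]
        states = list(dist.keys())
        weights = list(dist.values())
        return random.choices(states, weights=weights)[0]

    # The histories ["rainy", "rainy", "sunny"] and ["sunny"] give the same
    # next-state distribution, because only the current state matters.
    print(step("sunny"))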

It's not my fault that neither of you understands what the Markov property is.

chpatrick 9 hours ago | parent [-]

By that definition, n-gram Markov chain text generators also include previous state, because you always feed in the last n grams. :) It's exactly the same situation as LLMs, just with a higher, but still fixed, n.
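
For example, here's a toy n-gram generator (made-up code, not anything from this thread) whose entire "state" is the last n-1 tokens:

    import random
    from collections import defaultdict

    # Toy n-gram text generator. Its state is just the last n-1 tokens, a
    # fixed-size window; nothing earlier can influence the next token.
    def build_model(tokens, n=3):
        model = defaultdict(list)
        for i in range(len(tokens) - n + 1):
            context = tuple(tokens[i:i + n - 1])   # fixed-size state
            model[context].append(tokens[i + n - 1])
        return model

    def generate(model, seed, length=20, n=3):
        out = list(seed)
        for _ in range(length):
            context = tuple(out[-(n - 1):])        # only the last n-1 tokens matter
            nexts = model.get(context)
            if not nexts:
                break
            out.append(random.choice(nexts))
        return out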

famouswaffles 7 hours ago | parent | next [-]

We've been through this. The context of an LLM is not fixed. Context windows are not n-gram orders.

They don't include previous state in the general case, because n-gram orders are too small and rigid to cover the history.

I think srean's comment upthread is spot on. The current situation, where the state can be anything you want it to be, just doesn't make for a productive conversation.
