NotAnOtter 4 days ago

Your overall point might be correct but your example does not prove your point.

A Markov chain is just the path taken through the course of a Markov process. The terms 'chain' and 'process' are sometimes conflated in this context, but this is the most common distinction I've seen. As such you can run a Markov process for some number of steps N times, and then ask how many of the generated chains contain the property you are interested in. The process is memoryless, but each chain is the result of the process and therefore contains memory.
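
For concreteness, a minimal Python sketch of that distinction; the two-state transition table and names are illustrative, not anything from the thread:

    import random

    # Illustrative transition probabilities for a two-state process.
    P = {"A": {"A": 0.9, "B": 0.1},
         "B": {"A": 0.5, "B": 0.5}}

    def run_process(P, start, steps):
        # Memoryless: each step looks only at the current state,
        # but the returned chain records the whole path taken.
        chain = [start]
        for _ in range(steps):
            states, probs = zip(*P[chain[-1]].items())
            chain.append(random.choices(states, weights=probs)[0])
        return chain

    # Run the process N times, then ask a path-dependent question
    # of the generated chains after the fact.
    chains = [run_process(P, "A", 20) for _ in range(1000)]
    print(sum("B" in c for c in chains) / len(chains))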

I agree 'random walk' is a superset of 'Markov process', but IMO when someone says random walk, they normally make assumptions that qualify it as a Markov chain. Therefore it's useful, as a teaching device, to just call it a random walk.

seanhunter 4 days ago | parent [-]

Interesting. I'd not heard the term Markov chain used to describe a path, so I checked my copy of "All of Statistics" by Larry Wasserman, and he (slightly) disagrees with both of us and says:

    "A Markov chain is a stochastic process for which the distribution of X_t depends only on X_{t-1}."
So he doesn't require it to be a discrete process, but he also doesn't take it to be a path. I guess the terminology is not 100% standardized. But anyhow, thanks for making me think about this. Always interesting.
JohnKemeny 4 days ago | parent | next [-]

But a random walk is precisely a stochastic process for which the _next state_ depends only on the _current state_. In terms of graphs (where _random walk_ comes from), the next node is decided by randomly selecting a neighbor of the current node.
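
A minimal sketch of that definition, with an illustrative adjacency list:

    import random

    # Illustrative adjacency lists for a small undirected graph.
    graph = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}

    def random_walk(graph, start, steps):
        # The next node depends only on the current node: a uniformly
        # random neighbor. That is exactly the Markov property.
        walk = [start]
        for _ in range(steps):
            walk.append(random.choice(graph[walk[-1]]))
        return walk

    print(random_walk(graph, 1, 10))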

seanhunter 3 days ago | parent | next [-]

I don't think that's true for random walks in general. A random walk is a process derived from taking random steps in some mathematical space. It can include jumps, and it can include memory.

Take a "path-avoiding" random walk. At time t, the distribution of the next step depends on whether or not I have at some point visited any of the nodes adjacent to the current one. That's not the current state; that's memory.
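
Concretely, something like this sketch (the graph is illustrative), where the step distribution consults the entire history rather than just the current node:

    import random

    # Illustrative adjacency lists.
    graph = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}

    def path_avoiding_walk(graph, start):
        # The next step is drawn only from neighbors *not already
        # visited*, so the distribution at time t depends on the
        # full history of the walk, not just the current node.
        walk = [start]
        visited = {start}
        while True:
            options = [v for v in graph[walk[-1]] if v not in visited]
            if not options:
                return walk  # stuck: every neighbor already visited
            nxt = random.choice(options)
            visited.add(nxt)
            walk.append(nxt)

    print(path_avoiding_walk(graph, 1))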

JohnKemeny 3 days ago | parent | next [-]

A random walk on a graph is a stochastic process that starts at a vertex and, at each step, moves to a randomly chosen neighboring vertex. Formally:

Given a graph, a random walk is a sequence of vertices [v_1, v_2, ..., v_k] such that each v_{i+1} is selected uniformly at random from the neighbors of v_i.

In weighted graphs, the next vertex is chosen with probability proportional to edge weights.
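
A sketch of the weighted case under that definition (the weights are illustrative):

    import random

    # Illustrative weighted adjacency: neighbor -> edge weight.
    wgraph = {"a": {"b": 3.0, "c": 1.0},
              "b": {"a": 3.0, "c": 2.0},
              "c": {"a": 1.0, "b": 2.0}}

    def weighted_step(wgraph, node):
        # Next vertex chosen with probability proportional to weight.
        nbrs, weights = zip(*wgraph[node].items())
        return random.choices(nbrs, weights=weights)[0]

    print(weighted_step(wgraph, "a"))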

seanhunter 3 days ago | parent [-]

I'm pretty sure it's not a requirement that the distribution be uniform, and a walk can also be path-dependent, as per the example I gave: a random walk where you're not allowed to visit a node more than once.

JohnKemeny 3 days ago | parent [-]

Here's a reference for the definition I'm using:

https://www.cs.yale.edu/homes/spielman/561/lect10-18.pdf

They're lecture notes (PDF) from a course on Spectral Graph Theory by Professor Daniel Spielman at Yale.

seanhunter 3 days ago | parent [-]

Those notes look interesting, thanks. I've really never heard anyone say you have to have uniform probabilities for a random walk on a graph. In fact, in the context where I'm most familiar with them (lattice/grid pricers), it's always specified as something like "the probability of branch a is p and of branch b is 1 - p" (i.e. explicitly not uniform), and they aren't a weighted graph in the normal sense.
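
A sketch of that kind of branch step, assuming a simple binomial up/down lattice; the parameters are illustrative:

    import random

    def binomial_lattice_walk(p, steps, start=0):
        # Up with probability p, down with 1 - p: explicitly
        # non-uniform, but still memoryless.
        path = [start]
        for _ in range(steps):
            path.append(path[-1] + (1 if random.random() < p else -1))
        return path

    print(binomial_lattice_walk(p=0.6, steps=10))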

srean 3 days ago | parent [-]

It doesn't have to be uniform or memoryless.

However, when people mention random walks without further qualification, they are usually talking about the uniform, memoryless case.

That vanilla case can be generalized extensively: in the kind of state space, the kind of index set, the kind of dependence, and so on.

seanhunter 3 days ago | parent [-]

That makes sense. Thanks.

vladimirralev 3 days ago | parent | prev [-]

But also, "the memory" of the random walk can be encoded in the state itself. In your example you can just keep accumulating the visited nodes, so your state space will now be the space of tuples of nodes from your initial space.
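
A sketch of that state-augmentation trick (graph and names illustrative): the state becomes the tuple of visited nodes, and the walk is Markov again on that bigger space:

    import random

    # Illustrative adjacency lists.
    graph = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}

    def augmented_step(graph, state):
        # state is the whole tuple of visited nodes; the next state
        # depends only on this (augmented) current state, so the
        # process is Markov on the space of tuples.
        options = [v for v in graph[state[-1]] if v not in state]
        if not options:
            return None  # absorbing: the walk is stuck
        return state + (random.choice(options),)

    state = (1,)
    while state is not None:
        print(state)
        state = augmented_step(graph, state)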

jmaker 4 days ago | parent | prev [-]

A Markov chain is commonly understood to be a time-discrete Markov process. Intuitively, it's a "chain" because you can "single out" its states at discrete points in time rather than over intervals. That's also Markov's original definition. Instead of Wasserman one can look up the notion in Wikipedia. A path is a notion relevant to the state space of a process - it's a realization of the states at every single point in time over the time span of interest.
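
A sketch of that chain-versus-path distinction (the weather kernel is illustrative): the chain is specified by its transition kernel, while a path is one sampled realization in the state space:

    import random

    # The chain (process) is defined by its transition kernel.
    kernel = {"rain": {"rain": 0.7, "sun": 0.3},
              "sun":  {"rain": 0.2, "sun": 0.8}}

    def sample_path(kernel, start, T):
        # A path assigns a state to every discrete time 0..T:
        # one realization of the chain.
        path = [start]
        for _ in range(T):
            states, probs = zip(*kernel[path[-1]].items())
            path.append(random.choices(states, weights=probs)[0])
        return path

    print(sample_path(kernel, "sun", 10))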