▲ | godelski 4 days ago
You want me to show that it is trivially false that all neural networks are Markov chains? A trivial counterexample is an RNN, which does not have the Markov property. Another trivial case is when the rows do not sum to 1: the internal states of a neural network are not required to be probability distributions, and in fact this isn't a requirement anywhere in a neural network, so whatever you want to call the transition matrix, you're going to have issues. Or the inverse of this, that all Markov chains are neural networks? Sure, here's my transition matrix [1]. I'm quite positive an LLM would be able to give you more examples.
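The row-sum point is easy to check mechanically. A sketch (the `is_row_stochastic` helper is something I'm making up here for illustration): a Markov transition matrix must be row-stochastic, while a Gaussian-initialized weight matrix carries no such constraint.

```python
import random

def is_row_stochastic(M, tol=1e-9):
    """True iff every entry is nonnegative and every row sums to 1."""
    return all(
        all(x >= 0 for x in row) and abs(sum(row) - 1.0) < tol
        for row in M
    )

# A valid Markov transition matrix: each row is a probability distribution.
transition = [[0.9, 0.1],
              [0.3, 0.7]]

# A typical neural-network weight matrix: Gaussian init, no constraint at all.
random.seed(0)
weights = [[random.gauss(0, 1) for _ in range(2)] for _ in range(2)]

print(is_row_stochastic(transition))  # True
print(is_row_stochastic(weights))     # almost surely False
```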
It's pretty clear you did not get your PhD in ML.
I think you're misunderstanding. Maybe I'm misunderstanding. But I'm failing to understand why you're jumping to the CDF. I also don't understand how this answers my question, since there are other ways to sample from a distribution knowing only its CDF, without using the uniform distribution. You can always convert to the uniform distribution, and there are lots of tricks for doing that; besides, the distribution in that SO post is the Rayleigh distribution, so we don't even need to. My question was not about whether uniform is clean, but whether it is a requirement. This just doesn't seem relevant at all.
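To make the point concrete, here is a sketch of two ways to sample the Rayleigh distribution (σ = 1 is my arbitrary choice): inverse-transform sampling through the CDF with a uniform draw, and sampling it directly as the norm of two independent Gaussians, which draws no explicit uniform at the call site (the library's Gaussian generator uses one internally, but the sampler's interface doesn't).

```python
import math
import random

random.seed(42)
SIGMA = 1.0

def rayleigh_inverse_cdf(u, sigma=SIGMA):
    """Invert the Rayleigh CDF F(x) = 1 - exp(-x^2 / (2 sigma^2))."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - u))

def rayleigh_via_gaussians(sigma=SIGMA):
    """The norm of (X, Y) with X, Y ~ N(0, sigma^2) is Rayleigh(sigma)."""
    x, y = random.gauss(0, sigma), random.gauss(0, sigma)
    return math.hypot(x, y)

n = 100_000
inv = [rayleigh_inverse_cdf(random.random()) for _ in range(n)]
direct = [rayleigh_via_gaussians() for _ in range(n)]

# Both sample means should be close to the Rayleigh mean sigma * sqrt(pi / 2).
expected = SIGMA * math.sqrt(math.pi / 2)  # about 1.2533
print(sum(inv) / n, sum(direct) / n, expected)
```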
▲ | measurablefunc 3 days ago | parent [-]
Either find the exact error in the proof or stop running around in circles. The proof is very simple, so if there is an error anywhere in it you should be able to find it very easily, but you haven't done that. You have only asked for unrelated clarifications and gone off on unrelated tangents.