Dylan16807 | 6 days ago
I'll try to keep this simple.

> I'm not disagreeing with you. You understand that, right?

We disagree about whether context can make a difference, right?

> The parent was talking about stringing together inferences. My argument was how you string them together matters. That's all. I said "context matters."

> TLDR: We can't determine if likelihood increases or decreases without additional context

The situations you describe where inference acts differently do not fall under the "stringing together"/"chaining" they were originally talking about. Context never makes their original statement untrue. Chaining always makes evidence weaker. To be extra clear, it's not about whether the evidence pushes your result number up or down; it's that the likelihood of the evidence itself being correct drops.

> It is the act of chaining together functions.

They were not talking about whether something is composition or not. When they said "string" and "chain" they were talking about a sequence of inferences where each one leads to the next. Composition can be used in a wide variety of contexts, so you need context to know whether composition weakens or strengthens an argument. But you do not need context to know whether stringing/chaining weakens or strengthens.

> No, you're being too strict in your definition of "chain".

No, you're being way too loose.

> This tells me you drew your chain wrong. If multiple things are each contributing to D independently then that is not A->B->C->D

??? Of course those are different. That's why I wrote "as opposed to".

> I also gave an example for the other case. So why focus on one of these and ignore the other?

I'm focused on the one you called a "counter example" because I'm arguing it's not an example. If you specifically want me to address "If these are being multiplied, then yes, this is going to decrease as xy < x and xy < y for every x, y < 1," then yes, that's correct. I never doubted your math, and everyone agrees about that one.
TL;DR: At this point I'm mostly sure we're only disagreeing about the definition of stringing/chaining? If yes, oops, sorry, I didn't mean to argue so much about definitions. If not, can you give me an example of something I would call a chain where adding a step increases the probability the evidence is correct? And I have no idea why you're talking about LLMs.
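A minimal sketch of the multiplication point both sides agree on (my own illustration, not from either commenter): if each step in a chain of inferences is independently correct with probability p < 1, the probability the whole chain is sound is the product of the step probabilities, so adding a step can only lower it.

```python
def chain_confidence(step_probs):
    """Probability every step in the chain holds, assuming the steps
    are independent and each holds with the given probability."""
    total = 1.0
    for p in step_probs:
        total *= p  # xy < x and xy < y whenever x, y < 1
    return total

# Each added step shrinks the overall confidence:
print(chain_confidence([0.9]))             # 0.9
print(chain_confidence([0.9, 0.8]))        # ~0.72
print(chain_confidence([0.9, 0.8, 0.95]))  # ~0.684
```

This only models the "A leads to B leads to C" case; independent pieces of evidence each pointing at D on their own are a different structure, as the comment above notes.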
godelski | 6 days ago | parent
Correct.
Okay, instead of just making claims and asking me to trust you, point to something concrete. I've even tried to google, but despite my years of study in statistics, measure theory, and even mathematical logic, I'm at a loss in finding your definition.

I'm aware of the Chain Rule of Probability, but that isn't the only place you'll find the term "chain" in statistics. Hell, the calculus Chain Rule is still used there too! So forgive me for being flustered, but you are literally arguing to me that a Markov Chain isn't a chain. Maybe I'm having a stroke, but I'm pretty sure the word "chain" is in "Markov Chain".
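For reference, the Chain Rule of Probability mentioned above factors a joint probability as P(A, B, C) = P(A) P(B|A) P(C|A, B); in a Markov chain, each factor conditions only on the previous state. A toy sketch (state names and transition values are mine, purely illustrative):

```python
# Toy two-state Markov chain: P(next | current) transition probabilities.
transitions = {
    ("sunny", "sunny"): 0.8, ("sunny", "rainy"): 0.2,
    ("rainy", "sunny"): 0.4, ("rainy", "rainy"): 0.6,
}

def path_probability(path):
    """Probability of a specific path given its first state, via the
    chain rule plus the Markov property: multiply P(next | current)
    over consecutive pairs of states."""
    prob = 1.0
    for current, nxt in zip(path, path[1:]):
        prob *= transitions[(current, nxt)]
    return prob

print(path_probability(["sunny", "sunny", "rainy"]))  # 0.8 * 0.2 = ~0.16
```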