atomicnature, 21 hours ago:
You can look into Judea Pearl's definitions of causality for more information. Pearl defines a ladder of causation:

1. Seeing (association)
2. Doing (intervention)
3. Imagining (counterfactuals)

In his view, most ML algos are at level 1: they look at data and draw associations. "Agents" have started taking some steps into level 2, doing. The smartest humans operate mostly at level 3, where they see things, gain experience, and later build up a "strong causal model" of the world, becoming capable of answering "what if" questions.
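To make the gap between rungs 1 and 2 concrete, here is a minimal sketch (my own toy example, not from Pearl or the comment above) of a hypothetical structural causal model with a hidden confounder Z that drives both X and Y. The observational association P(Y=1 | X=1) comes out higher than the interventional quantity P(Y=1 | do(X=1)), which is what you get by forcing X regardless of Z. All the probabilities in it are made up for illustration.

```python
import random

random.seed(0)
N = 100_000

def sample(do_x=None):
    """One draw from a toy SCM: Z -> X, Z -> Y, X -> Y (numbers are arbitrary)."""
    z = random.random() < 0.5                                      # hidden confounder
    x = do_x if do_x is not None else (random.random() < (0.9 if z else 0.1))
    p_y = 0.2 + 0.5 * z + 0.2 * x                                  # Y depends on Z and X
    y = random.random() < p_y
    return x, y

# Rung 1 (seeing): association P(Y=1 | X=1) estimated from observational data
obs = [sample() for _ in range(N)]
p_assoc = sum(y for x, y in obs if x) / sum(x for x, _ in obs)

# Rung 2 (doing): P(Y=1 | do(X=1)) estimated by intervening on X directly
intv = [sample(do_x=True) for _ in range(N)]
p_do = sum(y for _, y in intv) / N

print(f"P(Y=1 | X=1)     ~ {p_assoc:.2f}")   # ~0.85, inflated by the confounder
print(f"P(Y=1 | do(X=1)) ~ {p_do:.2f}")      # ~0.65, the actual causal effect of X
```

A rung-1 learner only ever sees the first number; getting the second requires either actually intervening or having a causal model good enough to adjust for Z. Rung 3 (counterfactuals, "what would Y have been for this individual had X been different?") needs even more: a full structural model you can run backwards and forwards, which is beyond this sketch.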