bethekind | 3 days ago
> the behavior of a complex system can simplify as it passes from one state to another. “Sometimes a high-dimensional system can tip,” said Lenton, “and when it gets near tipping, it starts to behave like a much lower-dimensional system.” The lesson, he added, echoes the one learned at Peter Lake: to “simplify without oversimplifying.”

Sounds a lot like what people WANT neural networks to do: collapse a high-dimensional situation into a very low-dimensional representation, ideally a binary answer, yes or no. I wonder if this bifurcation/chaos math has implications for ML work.
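For what it's worth, the "behaves like a lower-dimensional system near tipping" idea is easy to see in a toy model. Here's a rough NumPy sketch (mine, not from the article): a 10-dimensional linear stochastic system where one mode's decay rate is a made-up control knob that goes toward zero as the system approaches a bifurcation. Near the tipping point, fluctuations pile up along that single soft mode (critical slowing down), so PCA on the trajectory sees an effectively one-dimensional system.

```python
# Toy sketch: effective dimensionality collapse near a tipping point.
# All parameter values here are illustrative, not from the article.
import numpy as np

rng = np.random.default_rng(0)
dim, dt, steps = 10, 0.01, 20000

def simulate(soft_rate):
    # Diagonal decay rates: one "soft" mode, the rest relax quickly.
    rates = np.full(dim, 5.0)
    rates[0] = soft_rate              # approaches 0 near the tipping point
    # Random rotation so the soft direction isn't axis-aligned.
    q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
    A = -q @ np.diag(rates) @ q.T
    x = np.zeros(dim)
    traj = np.empty((steps, dim))
    for t in range(steps):
        # Euler-Maruyama step with isotropic noise.
        x = x + dt * (A @ x) + np.sqrt(dt) * rng.standard_normal(dim)
        traj[t] = x
    return traj

def variance_explained_by_top_pc(traj):
    cov = np.cov(traj.T)
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return eig[0] / eig.sum()

print("far from tipping:", variance_explained_by_top_pc(simulate(soft_rate=5.0)))
print("near tipping:    ", variance_explained_by_top_pc(simulate(soft_rate=0.1)))
```

Far from the tipping point the top principal component explains roughly its fair share (~10%) of the variance; near it, the soft mode dominates (~85% in this setup), which is the kind of collapse the quote is describing.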