godelski | 6 days ago
I think the big insight was how useful this low-order method still is. Many people don't appreciate how new the study of high-dimensional mathematics (let alone high-dimensional statistics) actually is; metric theory didn't really start until around the early 1900s. The big reason these systems are still mostly black boxes is that we still have a long way to go in understanding these spaces.

But it's worth mentioning that low-order approximations can still lock you out of different optima. While I agree the (Latent) Manifold Hypothesis pretty likely applies to many problems, even relatively low-dimensional spaces (like 10D) are quite complex and have lots of unintuitive properties. With topics like language and images, I think it's safe to say you still have to operate in high dimensions, which means contending with the complexities of the concentration of measure (an idea from the 70's). Still, I don't think anyone expected things to have worked out as well as they have. If anything, I think it's more surprising we haven't run into issues earlier!

I think there are still some pretty grand problems left for AI/ML. Personally, this is why I push back against much of the hype. A hype machine is fine if the end is in sight, but a hype machine creates a bubble, and the gamble is whether you can fill the bubble before it pops. If it pops first, it all comes crashing down. It's been a very hot summer, but I'm worried the hype will lead to a winter. I'd rather have had a longer summer than a hotter summer followed by a winter.
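The concentration-of-measure point can be sketched numerically. The code below is my own illustration, not anything from the comment: it samples random points in a low- and a high-dimensional unit cube and compares the relative spread of pairwise distances, which shrinks as dimension grows, so "nearest" and "farthest" neighbors start to look alike.

```python
# Sketch of concentration of measure: pairwise Euclidean distances
# between uniform random points in [0,1]^d concentrate around their
# mean as d grows, so the relative spread (std/mean) shrinks.
import random
import math

def relative_spread(dim, n_points=100, seed=0):
    """Std/mean of pairwise distances for uniform points in [0,1]^dim."""
    rng = random.Random(seed)
    pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
    dists = []
    for i in range(n_points):
        for j in range(i + 1, n_points):
            d2 = sum((a - b) ** 2 for a, b in zip(pts[i], pts[j]))
            dists.append(math.sqrt(d2))
    mean = sum(dists) / len(dists)
    var = sum((x - mean) ** 2 for x in dists) / len(dists)
    return math.sqrt(var) / mean

low = relative_spread(2)      # 2D: distances vary a lot relative to their mean
high = relative_spread(1000)  # 1000D: distances are nearly indistinguishable
print(low, high)
```

Even at a modest 1000 dimensions the effect is stark, which is part of why intuition built in 2D and 3D fails for the spaces language and image models actually work in.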