joe_the_user 3 days ago
So the "Bitter Lesson" paper actually came up recently and I was surprised to discover that what it claimed was sensible and not at "all you need is data" or "data is inherently better" The first line and the conclusion is: "The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin." [1] I don't necessary agree with it's examples or the direction it vaguely points at. But it's basic statement seems sound. And I would say that there's lot of opportunity for engineer, broadly speaking, in the process of creating "general methods that leverage computation" (IE, that scale). What the bitter lesson page was roughly/really about was earlier "AI" methods based on logic-programming and which including information on the problem domain in the code itself. And finally, the "engineering" the paper talks about actually is pro-Bitter lesson as far as I can tell. It's taking data routing and architectural as "engineering" and here I agree this won't work - but for the opposite reason - specifically 'cause I don't just data routing/process will be enough. [1]https://www.cs.utexas.edu/~eunsol/courses/data/bitter_lesson... |