falcor84 4 hours ago

But that's how ML works - as long as the difference between the output and the target is differentiable with respect to the model's parameters, we can use gradient descent to optimize it away. Eventually, the difference becomes imperceptible.
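A minimal sketch of that idea (a toy example, not any particular system): a single weight fit by hand-rolled gradient descent until the output matches the target.

    # Toy example: shrink the difference between output (w * x)
    # and target (y) by following the gradient of the squared error.

    def loss(w, x, y):
        return (w * x - y) ** 2

    def grad(w, x, y):
        # d/dw of (w*x - y)^2 = 2 * (w*x - y) * x
        return 2 * (w * x - y) * x

    w, x, y, lr = 0.0, 3.0, 6.0, 0.01
    for _ in range(200):
        w -= lr * grad(w, x, y)  # step against the gradient

    print(w, loss(w, x, y))  # w approaches 2.0; the difference is imperceptible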

And of course that brings me back to my favorite xkcd - https://xkcd.com/810/

emp17344 4 hours ago | parent [-]

Gradient descent is not a magic wand that makes computers behave like anything you want. The difference is still quite perceptible after several years and trillions of dollars in R&D, and there’s no reason to believe it’ll get much better.

falcor84 2 hours ago | parent [-]

Really, there's "no reason"? For me, watching ML gradually get better at every single benchmark thrown at it is quite a good reason. At this stage, the burden of proof is clearly on those who say it'll stop improving.