FilosofumRex 6 days ago

Historical fact: differentiable programming was a little-known secret back in the '90s, used mainly by engineers simulating numerically stiff systems like nukes and chemical processes in Fortran 95. It then disappeared for nearly 30 years before being rediscovered by ML/AI researchers!

taeric 5 days ago | parent | next [-]

Computer algebra systems (CAS) were not really a secret, and they often have many, many tricks that we are constantly relearning. Some of this relearning, of course, is by design: a lot of what they do are things we teach, like how to compute various functions. Repeated squaring, for instance, is a fun topic (a quick sketch follows below).

A lot of the current wave of relearning comes down to the fact that we now have the compute power to do these things in more places. Much of it has also long been done in expensive environments that many of us simply don't have access to.
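
To make the repeated-squaring point concrete, here is a rough Python sketch of exponentiation by squaring (the function name is just illustrative):

    def power(base, exp):
        # Compute base**exp in O(log exp) multiplications by squaring
        # the base and halving the exponent at each step.
        result = 1
        while exp > 0:
            if exp & 1:        # low bit set: fold the current base into the result
                result *= base
            base *= base       # square
            exp >>= 1          # halve
        return result

    assert power(3, 13) == 3 ** 13

The same halve-and-square structure shows up all over CAS internals, e.g. in fast polynomial and matrix powers.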

kxyvr 5 days ago | parent | prev | next [-]

Automatic differentiation has been actively and continuously used in some communities for the last 40 years. Louis Rall published an entire book about it in 1981, and one of the more popular books on AD, by Griewank, was published in 2000. I learned about it in university in the early 2000s. I do agree that the technology was not used as widely as it should have been until more recently, but it was well known within the numerical math world and in steady use over the years.

constantcrying 5 days ago | parent | prev | next [-]

It wasn't forgotten; I learned it in university outside of any AI context. It had simply exhausted most of its applications and ceased to be a particularly active research topic.

thechao 5 days ago | parent | prev [-]

My PhD dissertation included a chapter (originally from a 2006 paper) on generic programming in CASes on the algorithmic differentiable ring (and operator). By the 1990s, algorithmic differentiation was easily 30–40 years old. Griewank and Monagan both knew people who had built early electromagnetic naval targeting computers that used the methodology "by hand" by at least the early '60s. (Very literally.)
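
For concreteness, the simplest instance of the ring idea is the dual numbers: elements a + b*eps with eps*eps = 0, where the eps coefficient carries the derivative and forward-mode AD falls out of the ring arithmetic. A toy Python sketch, not anything from the dissertation:

    class Dual:
        # Dual number a + b*eps with eps*eps == 0; b carries the derivative.
        def __init__(self, a, b=0.0):
            self.a, self.b = a, b

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.a + other.a, self.b + other.b)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # The product rule emerges from (a1 + b1*eps) * (a2 + b2*eps).
            return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
        __rmul__ = __mul__

    def derivative(f, x):
        # Seed the eps part with 1.0 and read the derivative back out.
        return f(Dual(x, 1.0)).b

    # d/dx (x^3 + 2x) at x = 3 is 3*9 + 2 = 29
    print(derivative(lambda x: x * x * x + 2 * x, 3.0))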

I watched the ML/AI bros actively ignore previous research — even when they were requested to properly cite sources they were plagiarizing — in real time. The race to publish (even for big journals) was so important that it was easier to ignore the rank dishonesty than it was to correct their misbehavior. I'm 1000x happier to not have stayed around for all that crap.

srean 5 days ago | parent [-]

> on the algorithmic differentiable ring (and operator)

That sounds like an interesting read. Do you have the chapter, or a reference to the paper, that you can share?

Regarding that crop of deep neural network researchers, their self-serving and willful blindness has a well-deserved reputation.

A grad student from Hinton's lab mentioned one researcher who would deliberately misspell a citation so that the cited paper's citation count would not go up.

thechao 4 days ago | parent [-]

My PhD, and its chapters, are junk. Just read Griewank; I don't have anything to add beyond his monograph. (We could never generalize the AD operator to hybrid mode, so it's useless for real workloads.)

srean 4 days ago | parent [-]

> My PhD, and its chapters, are junk.

I think most people feel that way about their own dissertation. I have Griewank.