goosedragons 3 days ago

It can be both. A mistake in AD primitives can lead to theoretically incorrect derivatives. With the system I use I have run into a few scenarios where edge cases aren't totally covered leading to the wrong result.

I have also run into numerical instability too.

froobius 3 days ago | parent [-]

> A mistake in AD primitives can lead to theoretically incorrect derivatives

Ok but that's true of any program. A mistake in the implementation of the program can lead to mistakes in the result of the program...

goosedragons 3 days ago | parent | next [-]

That's true! But it's also true that any program dealing with floats can run into numerical instability if care isn't taken to avoid it, no?

It's also not necessarily immediately obvious that the derivatives ARE wrong if the implementation is wrong.
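(As an illustration of the float point, not something from the thread: a classic catastrophic-cancellation case is computing e^x - 1 for tiny x, where the naive expression loses most of its significant digits while the purpose-built `math.expm1` does not.)

```python
import math

x = 1e-12

# Naive: exp(x) is ~1.000000000001, and subtracting 1.0 cancels
# almost all of the significant bits that were stored.
naive = math.exp(x) - 1.0

# Stable: expm1 computes exp(x) - 1 directly, avoiding the cancellation.
stable = math.expm1(x)

print(abs(naive - x) / x)   # relative error around 1e-4
print(abs(stable - x) / x)  # relative error near machine epsilon
```

Both expressions are "correct" mathematically; only one of them is trustworthy in floating point, which is exactly the kind of thing that is not obvious until you go looking for it.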

srean 3 days ago | parent | next [-]

> It's also not necessarily immediately obvious that the derivatives ARE wrong if the implementation is wrong.

It's neither foolproof nor a complete guarantee, but an absolute must is a check that the loss function is decreasing. It quickly catches a common error: the sign coming out wrong in the gradient. Part of the good practice one learns in grad school.
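A minimal sketch of that sanity check (the loss and gradient here are illustrative stand-ins, not anything from the thread): assert after every descent step that the loss did not go up, and a sign-flipped gradient trips the assertion almost immediately.

```python
# Hypothetical quadratic loss with a hand-written gradient.
def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)  # flip this sign and the check below fires

def descend(w, lr=0.1, steps=20):
    prev = loss(w)
    for _ in range(steps):
        w -= lr * grad(w)
        cur = loss(w)
        # The sanity check: gradient descent on a smooth loss with a
        # small step size should never make the loss increase.
        assert cur <= prev + 1e-12, "loss increased: gradient sign/scale likely wrong"
        prev = cur
    return w

w = descend(10.0)  # converges toward the minimum at w = 3
```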

froobius 3 days ago | parent | prev [-]

You can pretty concretely and easily check that the AD primitives are correct by comparing them to numerical differentiation.
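For example, a central finite difference gives an O(h²)-accurate estimate to compare a hand-written or AD-produced derivative against (the function and tolerance here are just an illustration):

```python
import math

def numerical_grad(f, x, h=1e-6):
    # Central difference: O(h^2) truncation error, the standard
    # gradient-check baseline for AD primitives.
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Check a hand-written derivative of sin against the numeric estimate.
def d_sin(x):
    return math.cos(x)

x = 0.7
assert abs(d_sin(x) - numerical_grad(math.sin, x)) < 1e-8
```

The usual caveat is choosing h: too large and truncation error dominates, too small and float round-off does, which is why h around 1e-6 is a common default for double precision.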

godelski 3 days ago | parent | prev [-]

I haven't watched the video but the text says they're getting like 60+% error on simple linear ODEs which is pretty problematic.

You're right, but the scale of the problem seems to be the issue.