ryang2718 3 days ago

I find it helpful to view least squares as fitting the noise to a Gaussian distribution.

MontyCarloHall 3 days ago | parent | next [-]

They both fit Gaussians, just different ones! OLS fits a 1D Gaussian to the set of errors in the y coordinates only, whereas TLS (PCA) fits a 2D Gaussian to the set of all (x,y) pairs.
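
For concreteness, a minimal sketch (toy data, numpy only, names my own) of the contrast: OLS minimizes the vertical residuals, i.e. fits a 1D Gaussian to the y-errors, while TLS takes the leading principal component of the centered (x, y) cloud, i.e. the major axis of a 2D Gaussian fit to all the pairs.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 2.0 * x + rng.normal(scale=0.5, size=200)   # toy data with true slope 2

    # OLS: minimize vertical (y) residuals -> 1D Gaussian on the y-errors
    slope_ols, intercept_ols = np.polyfit(x, y, 1)

    # TLS via PCA: leading eigenvector of the 2D covariance of (x, y)
    XY = np.column_stack([x, y])
    XY_centered = XY - XY.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(XY_centered.T))
    v = eigvecs[:, np.argmax(eigvals)]              # direction of largest variance
    slope_tls = v[1] / v[0]

    print(slope_ols, slope_tls)   # similar here; they diverge once x is noisy too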

ryang2718 3 days ago | parent [-]

Well, that was a knowledge gap, thank you! I certainly need to review PCA, but Python makes it a bit too easy.

LudwigNagasena 3 days ago | parent | prev | next [-]

The OLS estimator is the minimum-variance linear unbiased estimator even without the assumption of a Gaussian error distribution.
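
That's the Gauss–Markov theorem: the errors need only be zero-mean, uncorrelated, and homoscedastic. A quick simulation sketch with deliberately non-Gaussian (uniform) noise, just to illustrate the unbiasedness part:

    import numpy as np

    rng = np.random.default_rng(0)
    true_slope = 1.5
    x = np.linspace(0.0, 1.0, 50)

    # Average the fitted OLS slope over many datasets with uniform noise
    slopes = [
        np.polyfit(x, true_slope * x + rng.uniform(-1, 1, size=x.size), 1)[0]
        for _ in range(5000)
    ]
    print(np.mean(slopes))   # ~1.5: unbiased even though the noise isn't Gaussian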

rjdj377dhabsn 3 days ago | parent [-]

Yes, and if I remember correctly, you get the Gaussian because it's the maximum-entropy (fewest additional assumptions about the shape) continuous distribution for a given variance.

porridgeraisin 3 days ago | parent [-]

And given a mean.
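
Right: among all continuous densities with a given mean μ and variance σ², differential entropy is maximized by the Gaussian. A sketch of the standard constrained-maximization statement:

    \[
      \max_{p}\; H(p) = -\int p(x)\,\log p(x)\,dx
      \quad \text{s.t.} \quad
      \int p(x)\,dx = 1,\;\;
      \int x\,p(x)\,dx = \mu,\;\;
      \int (x-\mu)^2\,p(x)\,dx = \sigma^2 .
    \]

Working through the Lagrangian forces log p(x) to be quadratic in x, i.e.

    \[
      p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
      \qquad
      H = \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right).
    \]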

contravariant 3 days ago | parent | prev [-]

Both of these do, in a way. They just differ in which Gaussian distribution they're fitting to.

And in how, I suppose. PCA is effectively moment matching, while least squares is maximum likelihood. These correspond to the two ways of minimizing the Kullback–Leibler divergence to or from a Gaussian distribution.
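
To spell out the two directions (with p the empirical data distribution and q_θ a Gaussian family, fitting meaning minimizing over θ):

    \[
      D_{\mathrm{KL}}(p \,\|\, q_\theta)
      = \mathbb{E}_{p}\!\left[\log \frac{p(x)}{q_\theta(x)}\right]
      = -\mathbb{E}_{p}\!\left[\log q_\theta(x)\right] - H(p),
    \]

so minimizing the forward direction over θ is exactly maximum likelihood, and for a Gaussian q_θ the optimum sets the mean and covariance to the sample moments. The reverse direction,

    \[
      D_{\mathrm{KL}}(q_\theta \,\|\, p)
      = \mathbb{E}_{q_\theta}\!\left[\log \frac{q_\theta(x)}{p(x)}\right],
    \]

in general gives a different, mode-seeking fit.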