ux 13 hours ago
I'm interested in the approach you're describing but it's hard to follow a comment in the margin. Is there a paper or an implementation example somewhere? | |||||||||||||||||
GistNoesis 11 hours ago | parent
The general technique is not recent; I was taught it in school in a global optimisation class more than 15 years ago. Here there is a small number of local minima, and the idea is to iterate over them in increasing order. I can't remember the exact name, but here is a more recent paper proposing "Sequential Gradient Descent", https://arxiv.org/abs/2011.04866, which features a similar idea. See also sequential convex programming: http://web.stanford.edu/class/ee364b/lectures/seq_notes.pdf

There is not really anything special to it; it's just standard local nonlinear minimization with constraints, e.g. Sequential Least Squares Quadratic Programming (SLSQP). It's about framing your problem as an optimization over "points" subject to constraints, applying the standard optimization toolbox, and recognizing which type of problem your specific problem is. You can write it as basic gradient descent if you don't care about performance. A minimal SLSQP sketch follows below.

The problem of finding the minimum of a quadratic function inside a disk is commonly known as the "Trust Region Subproblem" https://cran.r-project.org/web/packages/trust/vignettes/trus... but in this specific case of distances to a curve we are in the easy positive-definite case (see the second sketch below).
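To make the "frame it as constrained minimization" idea concrete, here is a minimal sketch (mine, not the commenter's): find the point on an implicit curve g(x) = 0 closest to a query point p, handing the squared distance and the equality constraint to SciPy's SLSQP. The unit circle and the query point are made-up illustration values.

    import numpy as np
    from scipy.optimize import minimize

    def closest_point_on_curve(g, p, x0):
        # Minimize squared distance ||x - p||^2 subject to g(x) = 0.
        objective = lambda x: np.sum((x - p) ** 2)
        res = minimize(objective, x0, method="SLSQP",
                       constraints=[{"type": "eq", "fun": g}])
        return res.x, np.linalg.norm(res.x - p)

    # Hypothetical example: distance from p = (2, 0) to the unit circle.
    g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0
    point, dist = closest_point_on_curve(g, np.array([2.0, 0.0]),
                                         np.array([0.5, 0.5]))
    print(point, dist)  # ~[1, 0], distance ~1

Different starting points x0 land in different local minima, which is exactly where iterating over them in increasing order comes in.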
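And a sketch of the easy positive-definite trust-region subproblem mentioned above: minimize 0.5 x^T A x + b^T x subject to ||x|| <= r. For positive-definite A the minimizer is either the unconstrained one or lies on the boundary, where a bisection on the Lagrange multiplier finds it; the bisection details are my own filling-in, not from the comment.

    import numpy as np

    def trs_pd(A, b, r, iters=60):
        # Unconstrained minimizer of 0.5 x^T A x + b^T x is x = -A^{-1} b.
        x = -np.linalg.solve(A, b)
        if np.linalg.norm(x) <= r:
            return x  # interior solution, constraint inactive
        # Boundary case: solve (A + lam*I) x = -b with lam >= 0 chosen so
        # ||x|| = r. For positive-definite A, ||x(lam)|| decreases
        # monotonically in lam, so plain bisection works.
        n = len(b)
        lo, hi = 0.0, 1.0
        while np.linalg.norm(np.linalg.solve(A + hi * np.eye(n), -b)) > r:
            hi *= 2.0  # grow the bracket until the norm drops below r
        for _ in range(iters):
            lam = 0.5 * (lo + hi)
            x = np.linalg.solve(A + lam * np.eye(n), -b)
            lo, hi = (lam, hi) if np.linalg.norm(x) > r else (lo, lam)
        return x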