GistNoesis 14 hours ago

This is fundamentally a geometric problem, and the author completely missed the geometric aspect by transforming everything into polynomials and root finding.

The naive generic way of finding the distance from a point P to a curve C([0,1]) is a procedure quite standard for global minimization: repeatedly "find a local minimum on the space constrained to be better than any previous minimum".

- Find a point P0, a local minimum of d(P,P0), subject to the constraint P0 = C(t0) (aka P0 is on C)

- define d0 = d(P,P0)

- Bisect the curve at t0 into two curves C1 = C([0,t0]) and C2 = C([t0,1])

- Find a point P1, a local minimum of d(P,P1), subject to the constraint P1 = C(t1) with 0 < t1 < t0 (aka P1 is on C1) and the additional constraint d(P,P1) < d0 (note here the inequality is strict so that we won't find P0 again), if it exists.

- define d1 = d(P,P1)

- Find a point P2, a local minimum of d(P,P2), subject to the constraint P2 = C(t2) with t0 < t2 < 1 (aka P2 is on C2) and the additional constraint d(P,P2) < d0 (note here the inequality is strict so that we won't find P0 again), if it exists.

- define d2 = d(P,P2)

Here, in the case of a cubic Bezier, the curve has only one loop, so you don't need to bisect any further. If the curve were of higher order, like a spiral, you would need to cascade the subproblems with a shrinking distance constraint (aka looking only for points in R^2 inside the circle around P whose radius is the current best distance).

So the distance is the min over all local minima found = min(d0, d1, d2).
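The bisection procedure above can be sketched with scipy's SLSQP, optimizing directly over the parameter t with bounds (rather than the slack-variable formulation); the control points and query point below are made up for illustration, and the strict inequality d < d0 is approximated by d <= d0 - eps since solvers want non-strict constraints:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up cubic Bezier control points and query point, for illustration only.
C0, C1, C2, C3 = map(np.array, [(0.0, 0.0), (1.0, 2.0), (2.0, -1.0), (3.0, 1.0)])
P = np.array([0.2, 1.0])

def bezier(t):
    # Cubic Bezier in Bernstein form.
    s = 1.0 - t
    return s**3 * C0 + 3 * s**2 * t * C1 + 3 * s * t**2 * C2 + t**3 * C3

def dist(t):
    return np.linalg.norm(bezier(t[0]) - P)

# Step 1: a first local minimum t0 over the whole parameter range.
res0 = minimize(dist, x0=[0.5], bounds=[(0.0, 1.0)], method="SLSQP")
t0, d0 = res0.x[0], float(res0.fun)

# Step 2: bisect at t0 and search each half for a strictly better minimum,
# with strictness approximated by d <= d0 - eps.
eps = 1e-9
candidates = [d0]
for lo, hi in [(0.0, t0), (t0, 1.0)]:
    if hi - lo < 1e-9:
        continue
    res = minimize(dist, x0=[(lo + hi) / 2], bounds=[(lo, hi)], method="SLSQP",
                   constraints=[{"type": "ineq",
                                 "fun": lambda t: d0 - eps - dist(t)}])
    if res.success and res.fun < d0 - eps:
        candidates.append(float(res.fun))

d_min = min(candidates)  # = min(d0, d1, d2)
```

When a half contains no point strictly closer than d0, the constrained subproblem is infeasible and the solver reports failure, which is exactly the "if it exists" clause above.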

Here the local minimization problems are in R^n (and inside disks centered around P), with n the dimension of the space. Because this is numerical approximation, you can either use slack (dual) variables for the t's which express the on-curve constraint, or barrier methods to express the disk constraints once a t parametrization has been chosen.

Multivariate Newton is guaranteed to work because distances are positive, so the Sequential Quadratic Programming subproblems are convex (scipy minimize "SLSQP"). (Whereas the author needed 5 polynomial roots, you only need to solve for 3 points, because each solve solves for two coordinates.)

A local minimum is a point which satisfies the KKT conditions.

This procedure is quite standard: it's used, for example, to find the eigenvalues of matrices iteratively, or to find all solutions of a minimization problem.

Where this procedure shines is when you have multiple splines and you want to find the minimal distance to them: you can partition the space efficiently and not compute distances to parts of curves which are too far away to have a chance of being the minimum. This scales independently of the resolution of your chain of splines. (Imagine a spiral: the number of local minima you need to visit is proportional to the complexity of the scene, not the resolution of the spiral.)
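That pruning idea can be illustrated with a branch-and-bound sketch on a single cubic: a Bezier piece lies inside the convex hull of its control points (here conservatively its bounding box), so any piece whose bounding-box distance already exceeds the current best can be discarded without further work. Curve and query point are made up:

```python
import numpy as np

def split(c):
    # De Casteljau subdivision at t = 1/2: one cubic becomes two cubics.
    a, b, d = (c[0] + c[1]) / 2, (c[1] + c[2]) / 2, (c[2] + c[3]) / 2
    e, f = (a + b) / 2, (b + d) / 2
    g = (e + f) / 2
    return np.array([c[0], a, e, g]), np.array([g, f, d, c[3]])

def bezier_min_dist(P, ctrl, tol=1e-6):
    stack, best = [np.asarray(ctrl, float)], float("inf")
    while stack:
        c = stack.pop()
        mins, maxs = c.min(axis=0), c.max(axis=0)
        # Lower bound: distance from P to the control-point bounding box.
        lb = np.linalg.norm(np.clip(P, mins, maxs) - P)
        if lb >= best:
            continue  # this piece cannot beat the current minimum: prune it
        # Upper bound: the piece endpoints lie exactly on the curve.
        best = min(best, np.linalg.norm(c[0] - P), np.linalg.norm(c[3] - P))
        if np.linalg.norm(maxs - mins) > tol:
            stack.extend(split(c))
    return best

ctrl = np.array([[0, 0], [1, 2], [2, -1], [3, 1]], float)
P = np.array([0.2, 1.0])
best_dist = bezier_min_dist(P, ctrl)
```

With many splines, the same bound lets you skip entire curves (or entire regions of a spatial index) whose boxes are farther than the running minimum, which is why the work tracks scene complexity rather than curve resolution.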

But when you are talking about computing the whole signed distance field, you should often instead solve the Eikonal equation over the space rather than computing individual distances.
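A crude fast-sweeping sketch of that idea for the unsigned field, assuming the curve has already been rasterized into seed cells where the distance is zero (Godunov upwind update, |grad u| = 1 elsewhere):

```python
import numpy as np

def eikonal_sweep(seed_mask, h=1.0, n_cycles=4):
    # Fast sweeping for |grad u| = 1 with u = 0 on seed cells.
    # Four alternating sweep directions propagate values across the grid.
    u = np.where(seed_mask, 0.0, np.inf)
    ny, nx = u.shape
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_cycles):
        for ys, xs in orders:
            for i in ys:
                for j in xs:
                    if seed_mask[i, j]:
                        continue
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < nx - 1 else np.inf)
                    if np.isinf(a) and np.isinf(b):
                        continue  # no information has reached this cell yet
                    if abs(a - b) >= h:  # one-sided (Godunov) update
                        cand = min(a, b) + h
                    else:                # two-sided quadratic update
                        cand = (a + b + np.sqrt(2 * h * h - (a - b) ** 2)) / 2
                    u[i, j] = min(u[i, j], cand)
    return u

# Example: a single seed cell in the middle of a 21x21 grid.
mask = np.zeros((21, 21), bool)
mask[10, 10] = True
u = eikonal_sweep(mask)
```

Each sweep is O(grid size), so the whole field costs a few passes over the grid instead of one curve-distance solve per pixel; the sign can then be recovered separately from the curve's orientation.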

ux 13 hours ago | parent [-]

I'm interested in the approach you're describing but it's hard to follow a comment in the margin. Is there a paper or an implementation example somewhere?

GistNoesis 11 hours ago | parent [-]

The general technique is not recent; I was taught it in school, in a global optimisation class, more than 15 years ago.

Here there is a small number of local minima; the idea is to iterate over them in increasing order.

Can't remember the exact name, but here is a more recent paper proposing "Sequential Gradient Descent" (https://arxiv.org/abs/2011.04866) which features a similar idea.

Sequential convex programming : http://web.stanford.edu/class/ee364b/lectures/seq_notes.pdf

There is not really anything special to it; it's just standard local nonlinear minimization with constraints, e.g. Sequential Least Squares Quadratic Programming (SLSQP).

It's just about framing it as an optimization problem looking for "points" with constraints, applying the standard optimization toolbox, and recognizing which type of problem your specific problem is. You can write it as basic gradient descent if you don't care about performance.

The problem of finding the minimum of a quadratic function inside a disk is commonly known as the "Trust Region Subproblem" (https://cran.r-project.org/web/packages/trust/vignettes/trus...), but in this specific case of distances to a curve we are in the easy positive-definite case.

ux 11 hours ago | parent [-]

What you described in your first message seemed similar to the approach used in the degree-N root solving algorithm by Cem Yuksel: splitting the curve into simpler segments, then bisecting within them. I'd be happy to explore what you suggested, but I'm not mathematically literate, so I'll be honest with you: what you're saying here is complete gibberish to me, and it's very hard to follow your point. It will take me weeks to figure out your suggestion and make a call as to whether it's actually simpler or more performant than what is proposed in the article.

GistNoesis 8 hours ago | parent [-]

I have written a gist to illustrate the approach I suggest. The code runs, but there may be bugs, and it doesn't use the appropriate optimizer. The purpose is to illustrate the optimisation approach.

https://gist.github.com/unrealwill/1ad0e50e8505fd191b617903b...

Point 33, "intersection between bezier curve with a circle", may be useful to find the feasible regions of the subproblems: https://pomax.github.io/bezierinfo/#introduction

The approach I suggest will need more work, and there are probably problematic edge cases to consider and numerical stability issues. Proper proofs have not been done. It's typically a high-work, low-reward situation.

It's mostly interesting because it highlights the link between roots and local minima, and because it respects the structure of the problem more.

To find roots, we can find a first root, then divide the polynomial by (x - root), and find a root again.
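That deflation loop looks like this as a toy sketch (not the gist's code; the example polynomial is made up): Newton finds one root, numpy's polydiv divides it out, and we repeat on the quotient.

```python
import numpy as np

def newton_root(c, x0, iters=100):
    # Plain Newton iteration on the polynomial with coefficients c.
    dc = np.polyder(c)
    x = x0
    for _ in range(iters):
        x = x - np.polyval(c, x) / np.polyval(dc, x)
    return x

# p(x) = (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6
c = np.array([1.0, -6.0, 11.0, -6.0])
roots = []
while len(c) > 1:
    r = newton_root(c, x0=0.0)
    roots.append(r)
    c, _ = np.polydiv(c, np.array([1.0, -r]))  # deflate by (x - r)
```

This is the root-finding analogue of the "exclude the minima already found" constraint: each solve is forbidden from rediscovering a previous answer.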

If you are not mathematically literate, it'll probably be hard to work out the details necessary to make it performant. But if you use a standard black-box optimizer with constraints, it should be able to do it in a few iterations.

You can simplify the problem by considering piecewise segments instead of splines. The extension to chains of segments is roughly the same, and the spatial acceleration structures based on branch-and-bound are easier.
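In the piecewise-linear case every subproblem has a closed form: the distance to a segment is the distance to the projection onto its line, with the parameter clamped to [0,1], and the polyline distance is the min over segments. A minimal sketch (points made up):

```python
import numpy as np

def point_segment_dist(p, a, b):
    # Closest point on segment [a, b]: clamp the projection parameter to [0, 1].
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(a + t * ab - p)

def polyline_dist(p, pts):
    # Brute-force min over segments; a branch-and-bound index would prune this.
    return min(point_segment_dist(p, pts[i], pts[i + 1])
               for i in range(len(pts) - 1))

pts = [np.array(v, float) for v in [(0, 0), (2, 0), (2, 2)]]
d = polyline_dist(np.array([1.0, 1.0]), pts)
```

The clamped parameter t plays the same role as the constrained t in the curve subproblems, just without needing an iterative solver.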