Comment by GistNoesis 12 hours ago
The general technique is not recent; I was taught it in school in a global optimisation class more than 15 years ago.
Here there is a small number of local minima, and the idea is to iterate over them in increasing order.
Can't remember the exact name, but here is a more recent paper proposing "Sequential Gradient Descent", https://arxiv.org/abs/2011.04866, which features a similar idea.
Sequential convex programming: http://web.stanford.edu/class/ee364b/lectures/seq_notes.pdf
There is nothing really special to it; it's just standard local nonlinear minimization with constraints, e.g. Sequential Least Squares Quadratic Programming (SLSQP).
It's just about framing it as an optimization problem looking for "points" subject to constraints, applying a standard optimization toolbox, and recognizing which type of problem your specific problem is. You can write it as basic gradient descent if you don't care about performance.
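For instance, here is a minimal sketch of that framing (not the article's code; the cubic Bézier control points and query point are made up), using SciPy's SLSQP with a few restarts to walk the handful of local minima:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical control points of a cubic Bezier curve in 2D.
P0, P1, P2, P3 = map(np.array, [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)])

def bezier(t):
    """Evaluate the cubic Bezier curve at parameter t."""
    s = 1.0 - t
    return s**3 * P0 + 3 * s**2 * t * P1 + 3 * s * t**2 * P2 + t**3 * P3

query = np.array([2.0, 0.5])  # the point whose distance to the curve we want

def sq_dist(t):
    """Squared distance from the query point to the curve at parameter t[0]."""
    d = bezier(t[0]) - query
    return float(d @ d)

# SLSQP handles the box constraint 0 <= t <= 1 directly. The objective can
# have several local minima, so restart from a few spread-out guesses and
# keep the best result (the "iterate over the local minima" part).
best = min(
    (minimize(sq_dist, x0=[t0], bounds=[(0.0, 1.0)], method="SLSQP")
     for t0 in np.linspace(0.0, 1.0, 5)),
    key=lambda r: r.fun,
)
print("t* =", best.x[0], "distance =", float(np.sqrt(best.fun)))
```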
The problem of finding the minimum of a quadratic function inside a disk is commonly known as the "Trust Region Subproblem" https://cran.r-project.org/web/packages/trust/vignettes/trus... but in this specific case of distances to a curve we are in the easy positive-definite case.
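As a rough illustration of that easy case (the matrix and vector below are made up, and this is the textbook positive-definite recipe, not the trust package's implementation): solve the unconstrained system first, and only if that minimizer falls outside the disk, push it back onto the boundary with a 1-D root find on the shift lambda:

```python
import numpy as np
from scipy.optimize import brentq

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite (illustrative)
b = np.array([-8.0, -6.0])
radius = 1.0

# Easy case: the unconstrained minimizer of 1/2 x^T A x + b^T x solves A x = -b.
x = np.linalg.solve(A, -b)

if np.linalg.norm(x) > radius:
    # The minimizer lies outside the disk, so the constrained solution sits on
    # the boundary: x(lam) = -(A + lam I)^{-1} b with lam > 0 chosen so that
    # ||x(lam)|| = radius. ||x(lam)|| decreases monotonically in lam, so a
    # bracketing root finder works.
    def boundary_gap(lam):
        x_lam = np.linalg.solve(A + lam * np.eye(len(b)), -b)
        return np.linalg.norm(x_lam) - radius

    lam = brentq(boundary_gap, 0.0, 1e6)
    x = np.linalg.solve(A + lam * np.eye(len(b)), -b)

print("x* =", x, "||x*|| =", np.linalg.norm(x))
```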
What you described in your first message seemed similar to the approach used in the degree-N root-solving algorithm by Cem Yuksel: splitting the curve into simpler segments, then bisecting within them. I'd be happy to explore what you suggested, but I'm not mathematically literate, so I'll be honest with you: what you're saying here is complete gibberish to me, and it's very hard to follow your point. It would take me weeks to work through your suggestion and make a call as to whether it's actually simpler or more performant than what is proposed in the article.