Fast convergence to non-isolated minima: four equivalent conditions for $\textrm{C}^2$ functions
DOI: 10.1007/s10107-024-02136-6
Publication Date: 2024-09-19
AUTHORS (2)
ABSTRACT
Optimization algorithms can see their local convergence rates deteriorate when the Hessian at the optimum is singular. These singularities are inescapable when the optima are non-isolated. Yet, under the right circumstances, several algorithms preserve their favorable rates even when optima form a continuum (e.g., due to over-parameterization). This has been explained under various structural assumptions, including the Polyak–Łojasiewicz condition, Quadratic Growth and the Error Bound. We show that, for cost functions which are twice continuously differentiable ($\textrm{C}^2$), those three (local) properties are equivalent. Moreover, we show they are equivalent to the Morse–Bott property, that is, local minima form differentiable submanifolds, and the Hessian of the cost function is positive definite along its normal directions. We leverage this insight to improve local convergence guarantees for safeguarded Newton-type methods under any (hence all) of the above assumptions. First, for adaptive cubic regularization, we secure quadratic convergence even with inexact subproblem solvers. Second, for trust-region methods, we argue capture can fail with an exact subproblem solver, then proceed to show linear convergence with an inexact one (Cauchy steps).
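For readers who want the four conditions side by side, here is one standard way to state their local forms near a set $S$ of local minimizers with common value $f^*$; the constants and neighborhoods below are illustrative assumptions, not the paper's exact statements, and each inequality is taken to hold for all $x$ in some neighborhood of $S$ (the last for all $x \in S$), possibly with different constants $\mu > 0$:

$$
\begin{aligned}
\text{Polyak–Łojasiewicz:}\quad & \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr),\\
\text{Quadratic Growth:}\quad & f(x) - f^* \;\ge\; \tfrac{\mu}{2}\,\mathrm{dist}(x, S)^2,\\
\text{Error Bound:}\quad & \|\nabla f(x)\| \;\ge\; \mu\,\mathrm{dist}(x, S),\\
\text{Morse–Bott:}\quad & S \text{ is a differentiable submanifold and } \nabla^2 f(x) \succ 0 \text{ on the normal space } N_x S \text{ for all } x \in S.
\end{aligned}
$$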
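As a small illustration of the Morse–Bott picture (an example of my own, not taken from the paper), consider $f(x) = \tfrac{1}{4}(\|x\|^2 - 1)^2$: its minimizers form the unit circle, a non-isolated set, so the Hessian at any minimizer is necessarily singular, yet it is positive definite along the direction normal to the circle. A minimal numpy sketch checks this:

```python
import numpy as np

# Illustrative example (not from the paper): f(x) = (||x||^2 - 1)^2 / 4.
# Its minimizers form the unit circle, so they are non-isolated and the
# Hessian at any minimizer is necessarily singular.

def grad(x):
    # grad f(x) = (||x||^2 - 1) * x
    return (x @ x - 1.0) * x

def hess(x):
    # Hess f(x) = (||x||^2 - 1) * I + 2 * x x^T
    return (x @ x - 1.0) * np.eye(x.size) + 2.0 * np.outer(x, x)

x_star = np.array([1.0, 0.0])            # a minimizer on the unit circle
print(np.linalg.norm(grad(x_star)))      # 0.0: x_star is indeed critical
print(np.linalg.eigvalsh(hess(x_star)))  # [0. 2.]: zero eigenvalue along
# the tangent to the circle of minimizers, strictly positive eigenvalue
# along the normal direction -- the Morse-Bott structure in the abstract.
```

The zero eigenvalue pairs with the tangent direction of the set of minimizers, while the positive eigenvalue pairs with the normal direction; this is exactly the structure under which the abstract's equivalences and convergence guarantees apply.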