On the evaluation complexity of constrained nonlinear least-squares and general constrained nonlinear optimization using second-order methods

When solving the general smooth nonlinear optimization problem involving equality and/or inequality constraints, an approximate first-order critical point of accuracy $\epsilon$ can be obtained by a second-order method using cubic regularization in at most $O(\epsilon^{-3/2})$ problem-function evaluations, the same order of bound as in the unconstrained case. This result is obtained by first showing that the …
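For readers who want to see the mechanism behind such bounds, here is a minimal sketch of a cubic-regularization step in the unconstrained setting; the fixed weight $\sigma$, the Rosenbrock test function and the generic inner solver are illustrative assumptions, not the method analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize, rosen_der, rosen_hess

def cubic_reg_step(x, grad, hess, sigma):
    """One cubic-regularization step: minimize the model
    m(s) = g's + 0.5 s'Hs + (sigma/3)||s||^3 around x.  Illustrative sketch."""
    g, H = grad(x), hess(x)
    model = lambda s: g @ s + 0.5 * s @ H @ s + (sigma / 3.0) * np.linalg.norm(s)**3
    s = minimize(model, np.zeros_like(x)).x   # generic inner solver, for illustration
    return x + s

x = np.array([-1.2, 1.0])                     # classic Rosenbrock starting point
for _ in range(50):
    x = cubic_reg_step(x, rosen_der, rosen_hess, sigma=1.0)
print(x, np.linalg.norm(rosen_der(x)))        # should approach the minimizer (1, 1)
```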

Worst-case evaluation complexity of non-monotone gradient-related algorithms for unconstrained optimization

The worst-case evaluation complexity of finding an approximate first-order critical point using gradient-related non-monotone methods for smooth nonconvex and unconstrained problems is investigated. The analysis covers a practical linesearch implementation of these popular methods, allowing for an unknown number of evaluations of the objective function (and its gradient) per iteration. It is shown that this …
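The flavour of a non-monotone linesearch, and why the number of function evaluations per iteration is not known in advance, can be seen in the following sketch (the memory length, constants and test function are arbitrary illustrative choices):

```python
import numpy as np

def nonmonotone_gradient_descent(f, grad, x, memory=10, c1=1e-4, max_iter=5000, tol=1e-6):
    """Gradient method with a non-monotone Armijo test against the max of the
    last `memory` function values (a Grippo-Lampariello-Lucidi-style sketch)."""
    history = [f(x)]
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        t, f_ref = 1.0, max(history[-memory:])           # non-monotone reference value
        while f(x - t * g) > f_ref - c1 * t * (g @ g):   # each test costs one f-evaluation
            t *= 0.5
        x = x - t * g
        history.append(f(x))
    return x

f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2          # Rosenbrock
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonmonotone_gradient_descent(f, grad, np.array([-1.2, 1.0])))
```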

On the complexity of the steepest-descent with exact linesearches

The worst-case complexity of the steepest-descent algorithm with exact linesearches for unconstrained smooth optimization is analyzed, and it is shown that the number of iterations of this algorithm which may be necessary to find an iterate at which the norm of the objective function’s gradient is less than a prescribed $\epsilon$ is, essentially, a multiple …
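On a strictly convex quadratic $f(x) = \tfrac{1}{2} x^\top A x - b^\top x$ the exact linesearch along $-g$ has the closed form $\alpha = g^\top g / (g^\top A g)$, so the algorithm being bounded can be written in a few lines (the matrix and tolerance below are illustrative):

```python
import numpy as np

def steepest_descent_exact(A, b, x, eps=1e-8, max_iter=10_000):
    """Steepest descent with exact linesearch on f(x) = 0.5 x'Ax - b'x, A SPD."""
    for k in range(max_iter):
        g = A @ x - b                        # gradient of the quadratic
        if np.linalg.norm(g) < eps:
            return x, k
        x = x - (g @ g) / (g @ A @ g) * g    # closed-form exact step length
    return x, max_iter

A = np.diag([1.0, 10.0, 100.0])              # ill-conditioning slows convergence
x, iters = steepest_descent_exact(A, np.ones(3), np.zeros(3))
print(iters, x)
```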

How much patience do you have? A worst-case perspective on smooth nonconvex optimization

The paper presents a survey of recent results in the field of worst-case complexity of algorithms for nonlinear (and possibly nonconvex) smooth optimization. Both the constrained and unconstrained cases are considered.

On the evaluation complexity of cubic regularization methods for potentially rank-deficient nonlinear least-squares problems and its relevance to constrained nonlinear optimization

We propose a new termination criterion suitable for potentially singular, zero or non-zero residual, least-squares problems, with which cubic regularization variants take at most $\mathcal{O}(\epsilon^{-3/2})$ residual- and Jacobian-evaluations to drive either the Euclidean norm of the residual or its gradient below $\epsilon$; this is the best-known bound for potentially singular nonlinear least-squares problems. We then …
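One plausible reading of such a termination test in code, stopping when either the residual or the gradient of its norm is small, is sketched below; the exact scaling used in the paper may differ.

```python
import numpy as np

def nls_terminated(r, J, eps):
    """Hedged sketch of a termination test for min 0.5 ||r(x)||^2: accept x if
    the residual itself is below eps (zero-residual case), or if the gradient
    of ||r(x)||, i.e. J'r / ||r||, is below eps (non-zero residual case)."""
    res_norm = np.linalg.norm(r)
    return res_norm <= eps or np.linalg.norm(J.T @ r) / res_norm <= eps
```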

A Note About The Complexity Of Minimizing Nesterov’s Smooth Chebyshev-Rosenbrock Function

This short note considers and resolves the apparent contradiction between known worst-case complexity results for first- and second-order methods for solving unconstrained smooth nonconvex optimization problems and a recent note by Jarre (2011) implying a very large lower bound on the number of iterations required to reach the solution’s neighbourhood for a specific problem with …
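For reference, the smooth Chebyshev-Rosenbrock function in question is usually stated in the related literature as below; the transcription should be checked against the paper itself.

```python
import numpy as np

def chebyshev_rosenbrock(x):
    """Nesterov's smooth Chebyshev-Rosenbrock function, in the form usually
    quoted: f(x) = (x_1 - 1)^2 / 4 + sum_i (x_{i+1} - 2 x_i^2 + 1)^2.
    Its minimizer is x* = (1, ..., 1) with f(x*) = 0."""
    return 0.25 * (x[0] - 1.0)**2 + np.sum((x[1:] - 2.0 * x[:-1]**2 + 1.0)**2)

print(chebyshev_rosenbrock(np.ones(10)))   # 0.0 at the minimizer
```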

Optimal Newton-type methods for nonconvex smooth optimization problems

We consider a general class of second-order iterations for unconstrained optimization that includes regularization and trust-region variants of Newton’s method. For each method in this class, we exhibit a smooth, bounded-below objective function whose gradient is globally Lipschitz continuous within an open convex set containing any iterates encountered and whose Hessian is $\alpha$-Hölder continuous (for …
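As a worked illustration of the class in question (the normalization of the regularization term below is one common convention, not necessarily the paper's), a regularized Newton-type method computes its step by minimizing

$$ m_k(s) \;=\; f(x_k) + g_k^\top s + \tfrac{1}{2}\, s^\top H_k s + \tfrac{\sigma_k}{2+\alpha}\,\|s\|^{2+\alpha}, $$

while a trust-region variant instead minimizes the quadratic part of $m_k$ subject to $\|s\| \le \Delta_k$.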

On the complexity of finding first-order critical points in constrained nonlinear optimization

The complexity of finding $\epsilon$-approximate first-order critical points for the general smooth constrained optimization problem is shown to be no worse than $O(\epsilon^{-2})$ in terms of function and constraint evaluations. This result is obtained by analyzing the worst-case behaviour of a first-order short-step homotopy algorithm consisting of a feasibility phase followed by an optimization phase, …
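Concretely, for the equality-constrained case, say, a standard notion of an $\epsilon$-approximate first-order critical point (the precise scaling may differ in the paper) is a pair $(x, y)$ satisfying

$$ \|c(x)\| \le \epsilon \qquad\text{and}\qquad \|\nabla f(x) + J(x)^\top y\| \le \epsilon, $$

i.e. approximate feasibility together with an approximately vanishing KKT residual.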

On the evaluation complexity of composite function minimization with applications to nonconvex nonlinear programming

We estimate the worst-case complexity of minimizing an unconstrained, nonconvex composite objective with a structured nonsmooth term by means of some first-order methods. We find that it is unaffected by the nonsmoothness of the objective in that a first-order trust-region or quadratic regularization method applied to it takes at most $O(\epsilon^{-2})$ function-evaluations to reduce the …
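The key device in such analyses, stated here in its standard form rather than necessarily the paper's exact notation, is to keep the nonsmooth term $h$ exact in the model: for the composite objective $\Phi(x) = f(x) + h(c(x))$ the trust-region step solves

$$ \min_{\|s\| \le \Delta_k} \; \ell_k(s) \;=\; f(x_k) + \nabla f(x_k)^\top s + h\big(c(x_k) + J(x_k)\,s\big), $$

so that only first derivatives of the smooth pieces $f$ and $c$ are ever evaluated.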

Complexity bounds for second-order optimality in unconstrained optimization

This paper examines worst-case evaluation bounds for finding weak minimizers in unconstrained optimization. For the cubic regularization algorithm, Nesterov and Polyak (2006) and Cartis, Gould and Toint (2010) show that at most $O(\epsilon^{-3})$ iterations may have to be performed to find an iterate which is within $\epsilon$ of satisfying second-order optimality conditions. We first show …
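In code, checking that an iterate is within $\epsilon$ of second-order optimality typically amounts to the two tests below; using one shared tolerance for gradient and curvature is a simplifying assumption.

```python
import numpy as np

def approx_second_order_critical(g, H, eps):
    """True if eps-approximately second-order critical: small gradient and an
    (almost) positive-semidefinite Hessian.  One shared tolerance is a
    simplification; papers often scale the two tests differently."""
    return np.linalg.norm(g) <= eps and np.linalg.eigvalsh(H).min() >= -eps
```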