An Inexact Regularized Newton Framework with a Worst-Case Iteration Complexity of $\mathcal{O}(\epsilon^{-3/2})$ for Nonconvex Optimization

An algorithm for solving smooth nonconvex optimization problems is proposed that, in the worst case, takes $\mathcal{O}(\epsilon^{-3/2})$ iterations to drive the norm of the gradient of the objective function below a prescribed positive real number $\epsilon$, and can take $\mathcal{O}(\epsilon^{-3})$ iterations to drive the leftmost eigenvalue of the Hessian of the objective above $-\epsilon$. The proposed …
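As a toy illustration of the kind of iteration this complexity theory covers (a one-dimensional cubic-regularized Newton method, not the paper's inexact framework; the test function and regularization weight are chosen for illustration only):

```python
import math

def f(x):          # simple nonconvex test function with minimizers at x = +/-1
    return 0.25 * x**4 - 0.5 * x**2

def grad(x):
    return x**3 - x

def hess(x):
    return 3.0 * x**2 - 1.0

def cubic_step(g, h, sigma):
    # closed-form global minimizer of the 1-D cubic model
    #   m(s) = g*s + 0.5*h*s^2 + (sigma/3)*|s|^3
    # (the degenerate case g = 0 with h < 0 is not needed in this demo)
    return -2.0 * g / (h + math.sqrt(h * h + 4.0 * sigma * abs(g)))

def regularized_newton(x0, sigma=1.0, tol=1e-8, max_iter=50):
    x = x0
    for k in range(max_iter):
        g = grad(x)
        if abs(g) <= tol:        # approximate first-order stationarity
            return x, k
        x += cubic_step(g, hess(x), sigma)
    return x, max_iter

x_star, iters = regularized_newton(0.9)
print(x_star, iters)
```

Starting near the minimizer at $x=1$, the iteration converges in a handful of steps; the regularization term bounds the step even where the Hessian is indefinite.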

Large-scale packing of ellipsoids

The problem of packing ellipsoids in n-dimensional space is considered in the present work. The proposed approach combines heuristic techniques with the resolution of recently introduced nonlinear programming models in order to construct solutions with a large number of ellipsoids. Numerical experiments illustrate that the introduced approach delivers good-quality solutions with a computational …
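As a hedged sketch of the constraint side of such models (the spherical special case in two dimensions, not the paper's ellipsoid formulation or its heuristics), a feasibility check for a candidate packing might look like:

```python
import math
import itertools

def packing_feasible(centers, radii, box=1.0, tol=1e-9):
    # feasibility check for the spherical special case of a packing model:
    # each ball must lie inside the [0, box]^2 square, and no two balls overlap
    for (x, y), r in zip(centers, radii):
        if not (r - tol <= x <= box - r + tol and r - tol <= y <= box - r + tol):
            return False
    for i, j in itertools.combinations(range(len(centers)), 2):
        (xi, yi), (xj, yj) = centers[i], centers[j]
        if math.hypot(xi - xj, yi - yj) < radii[i] + radii[j] - tol:
            return False
    return True

# four equal balls centered symmetrically in the unit square
c = [(0.25, 0.25), (0.25, 0.75), (0.75, 0.25), (0.75, 0.75)]
print(packing_feasible(c, [0.25] * 4))   # True
print(packing_feasible(c, [0.30] * 4))   # False: radius-0.3 balls violate the constraints
```

In the actual models the nonoverlap condition for ellipsoids is far less trivial than a distance test, which is part of what makes the nonlinear programming formulations interesting.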

A note on preconditioning weighted linear least squares, with consequences for weakly-constrained variational data assimilation

The effect of preconditioning linear weighted least-squares using an approximation of the model matrix is analyzed, showing the interplay of the eigenstructures of both the model and weighting matrices. A small example is given illustrating the resulting potential inefficiency of such preconditioners. Consequences of these results in the context of the weakly-constrained 4D-Var data assimilation …
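A toy 2x2 example (not the paper's analysis) shows one facet of this interplay: when the model matrix is orthogonal, a preconditioner built from the model matrix alone is the identity, so the preconditioned normal matrix inherits the full conditioning of the weighting matrix.

```python
import math

def t2(A):  # transpose of a 2x2 matrix
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

def mm2(A, B):  # product of two 2x2 matrices
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def eig_sym2(M):  # eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]]
    a, b, c = M[0][0], M[0][1], M[1][1]
    tr, det = a + c, a * c - b * b
    d = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    return tr / 2.0 - d, tr / 2.0 + d

theta = 0.3
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]      # orthogonal model matrix
W = [[1.0, 0.0], [0.0, 100.0]]                 # ill-conditioned weighting matrix

M = mm2(mm2(t2(A), W), A)                      # normal matrix A^T W A
lo, hi = eig_sym2(M)
print(hi / lo)   # condition number: that of W, untouched by the preconditioner A^T A = I
```

The eigenvalues of $A^{\mathsf T} W A$ here are exactly those of $W$, so no preconditioner based on $A$ alone can improve them.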

Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization

We establish or refute the optimality of inexact second-order methods for unconstrained nonconvex optimization from the point of view of worst-case evaluation complexity, improving and generalizing the results of Cartis, Gould and Toint (2010, 2011). To this end, we consider a new general class of inexact second-order algorithms for unconstrained optimization that includes regularization and trust-region …

On the use of the saddle formulation in weakly-constrained 4D-VAR data assimilation

This paper discusses the practical use of the saddle variational formulation for the weakly-constrained 4D-VAR method in data assimilation. It is shown that the method, in its original form, may produce erratic results or diverge because of the inherent lack of monotonicity of the produced objective function values. Convergent, variationally coherent variants of the algorithm …

A convergence frame for inexact nonconvex and nonsmooth algorithms and its applications to several iterations

In this paper, we consider the convergence of an abstract inexact nonconvex and nonsmooth algorithm. We impose a pseudo sufficient descent condition and a pseudo relative error condition on the algorithm, both of which involve an auxiliary sequence, and we assume that a continuity condition holds. In fact, a wide class of classical inexact nonconvex …

Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation

We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton’s method for self-concordant functions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451–482, 2014], and extends recent performance estimation results for the method of …
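As one example of the kind of tight guarantee performance estimation can certify (the classical worst-case rate for gradient descent with exact line search on an $L$-smooth, $\mu$-strongly convex function; quoted from the general performance-estimation literature, not from this paper's statement):

```latex
f(x_N) - f_\star \;\le\; \left(\frac{L-\mu}{L+\mu}\right)^{2N} \bigl(f(x_0) - f_\star\bigr)
```

The semidefinite programming approach certifies such bounds by exhibiting both a proof of the inequality and a worst-case function attaining it.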

Inner Conditions for Error Bounds and Metric Subregularity of Multifunctions

We introduce a new class of sets, functions and multifunctions which is shown to be large and to enjoy some nice common properties with the convex setting. Error bounds for objects attached to this class are characterized in terms of inner conditions of Abadie’s type, that is conditions bearing on normal cones and coderivatives at …

Iteratively Linearized Reweighted Alternating Direction Method of Multipliers for a Class of Nonconvex Problems

In this paper, we consider solving a class of nonconvex and nonsmooth problems frequently appearing in signal processing and machine learning research. The traditional alternating direction method of multipliers encounters difficulties, both theoretical and computational, in solving the nonconvex and nonsmooth subproblem. In view of this, we propose a reweighted alternating direction method of …
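For orientation, the classical (unmodified) ADMM skeleton that such methods build on can be sketched on a simple one-dimensional convex splitting; the reweighting and linearization schemes of the paper are not reproduced here:

```python
def soft_threshold(v, t):
    # proximal operator of t*|.|
    return max(abs(v) - t, 0.0) * (1.0 if v > 0 else -1.0)

def admm_l1(a, lam, rho=1.0, iters=200):
    # classical scaled ADMM for  min_x 0.5*(x - a)^2 + lam*|x|
    # via the splitting  min 0.5*(x - a)^2 + lam*|z|  s.t.  x = z
    x = z = u = 0.0
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # quadratic x-subproblem
        z = soft_threshold(x + u, lam / rho)    # prox step for lam*|.|
        u += x - z                              # scaled dual update
    return x, z

x, z = admm_l1(3.0, 1.0)
print(x, z)   # both approach soft_threshold(3, 1) = 2
```

In the nonconvex, nonsmooth setting of the paper the z-subproblem no longer has such a clean proximal map, which is precisely the difficulty the reweighting is designed to address.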

Convergence Analysis of Processes with Valiant Projection Operators in Hilbert Space

Convex feasibility problems require finding a point in the intersection of a finite family of convex sets. We propose to solve such problems by performing set enlargements and applying a new kind of projection operator called a valiant projector. A valiant projector onto a convex set implements a special relaxation strategy, proposed by Goffin in 1971, …
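A minimal sketch of the generic relaxed-projection scheme underlying such methods (plain relaxed alternating projections onto two convex sets in the plane; the paper's valiant projectors and set enlargements are a refinement not reproduced here):

```python
import math

def proj_disk(p, r=1.0):
    # projection onto the disk of radius r centered at the origin
    n = math.hypot(p[0], p[1])
    return p if n <= r else (p[0] * r / n, p[1] * r / n)

def proj_halfplane(p, c=0.5):
    # projection onto the half-plane {x : x[0] >= c}
    return (max(p[0], c), p[1])

def relaxed_alternating_projections(p, lam=0.9, iters=500):
    # x <- x + lam * (P_C(x) - x), cycling through the sets
    for _ in range(iters):
        for proj in (proj_disk, proj_halfplane):
            q = proj(p)
            p = (p[0] + lam * (q[0] - p[0]), p[1] + lam * (q[1] - p[1]))
    return p

p = relaxed_alternating_projections((3.0, 2.0))
print(p, math.hypot(*p) <= 1.0 + 1e-6, p[0] >= 0.5 - 1e-6)
```

With a relaxation parameter in $(0,2)$ and a nonempty intersection, the iterates converge to a feasible point; the relaxation strategy is exactly the knob that operators like Goffin's, and the valiant projectors here, tune.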