Global Convergence of Algorithms Under Constant Rank Conditions for Nonlinear Second-Order Cone Programming

In [R. Andreani, G. Haeser, L. M. Mito, H. Ramírez C., Weak notions of nondegeneracy in nonlinear semidefinite programming, arXiv:2012.14810, 2020], the classical notion of nondegeneracy (or transversality) and Robinson’s constraint qualification were revisited in the context of nonlinear semidefinite programming by exploiting the structure of the problem, namely its eigendecomposition. This allows formulating the …

Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations

Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991), and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all used difference vectors and for quadratic objective functions gives the best improvement …
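For background, the quasi-Newton (secant) equations referred to above have the following standard form for the stored difference vectors; this is textbook notation, not a restatement of the authors' particular update:

\[
  s_i = x_{i+1} - x_i, \qquad y_i = \nabla f(x_{i+1}) - \nabla f(x_i), \qquad
  B_{k+1} s_i = y_i \quad \text{for all stored } i,
\]

or, collecting the $m$ most recent pairs columnwise in $S_k = [s_{k-m+1}, \dots, s_k]$ and $Y_k = [y_{k-m+1}, \dots, y_k]$, the block form $B_{k+1} S_k = Y_k$. The classical (non-block) BFGS update enforces only the most recent equation $B_{k+1} s_k = y_k$.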

Full-low evaluation methods for derivative-free optimization

We propose a new class of rigorous methods for derivative-free optimization with the aim of delivering efficient and robust numerical performance for functions of all types, from smooth to non-smooth, and under different noise regimes. To this end, we have developed Full-Low Evaluation methods, organized around two main types of iterations. The first iteration type …
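As a rough illustration of the two-iteration-type idea only, here is a minimal Python sketch that alternates an evaluation-hungry finite-difference step with a cheap direct-search fallback; the switching rule, step types, and all parameter names are assumptions for illustration, not the authors' Full-Low Evaluation method.

# Hypothetical sketch of a two-mode derivative-free loop; not the
# authors' method, only an illustration of mixing two iteration types.
import numpy as np

def full_low_sketch(f, x0, budget=500, h=1e-6, alpha=1.0):
    x = np.asarray(x0, dtype=float)
    n, fx, evals = x.size, f(x), 1
    while evals < budget:
        # "Full evaluation" style step: a forward-difference gradient
        # estimate (n evaluations) followed by an Armijo-type test.
        g = np.array([(f(x + h * e) - fx) / h for e in np.eye(n)])
        evals += n
        trial = x - alpha * g
        ft = f(trial); evals += 1
        if ft <= fx - 1e-4 * alpha * (g @ g):
            x, fx = trial, ft
            continue
        # "Low evaluation" fallback: a cheap coordinate poll with a
        # shrinking step size, in the spirit of direct search.
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            ft = f(x + alpha * d); evals += 1
            if ft < fx:
                x, fx = x + alpha * d, ft
                break
        else:
            alpha *= 0.5  # no poll direction improved: shrink the step
    return x, fx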

A novel approach for bilevel programs based on Wolfe duality

This paper considers a bilevel program, which has many applications in practice. To develop effective numerical algorithms, it is generally necessary to transform the bilevel program into a single-level optimization problem. The most popular approach is to replace the lower-level program with its KKT conditions, so that the bilevel program can be transformed into a …
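For reference, the standard KKT-based single-level reformulation alluded to above reads as follows for a lower-level problem $\min_y f(x,y)$ s.t. $g(x,y) \le 0$; it is equivalent to the bilevel program only under lower-level convexity and a constraint qualification, and the paper's Wolfe-duality-based approach is an alternative to it:

\[
\begin{aligned}
  \min_{x,y,\lambda} \quad & F(x,y) \\
  \text{s.t.} \quad & \nabla_y f(x,y) + \nabla_y g(x,y)^{\top}\lambda = 0, \\
  & 0 \le \lambda \perp -g(x,y) \ge 0,
\end{aligned}
\]

where $F$ is the upper-level objective; the complementarity constraint makes the reformulated problem a mathematical program with equilibrium constraints.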

A study of Liu-Storey conjugate gradient methods for vector optimization

This work presents a study of Liu-Storey (LS) nonlinear conjugate gradient (CG) methods to solve vector optimization problems. Three variants of the LS-CG method originally designed to solve single-objective problems are extended to the vector setting. The first algorithm restricts the LS conjugate parameter to be nonnegative and uses a sufficiently accurate line search satisfying …
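For context, in the single-objective case the Liu-Storey parameter and the nonnegative restriction mentioned above take the standard form below (with $g_k = \nabla f(x_k)$); the extensions in the paper replace these quantities with their vector-optimization counterparts:

\[
  d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
  \beta_k^{LS} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{-d_k^{\top} g_k}, \qquad
  \beta_k^{LS+} = \max\{\beta_k^{LS},\, 0\}.
\]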

Sequential constant rank constraint qualifications for nonlinear semidefinite programming with applications

We present new constraint qualification conditions for nonlinear semidefinite programming that extend some of the constant rank-type conditions from nonlinear programming. As an application of these conditions, we provide a unified global convergence proof for a class of algorithms to stationary points without assuming either uniqueness of the Lagrange multiplier or boundedness of the Lagrange …
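As background, the classical constant rank constraint qualification (CRCQ) of Janin from nonlinear programming, which such conditions generalize, can be stated as follows; the semidefinite versions in the paper are necessarily more involved. For a feasible point $x^*$ of $\min f(x)$ s.t. $h(x) = 0$, $g(x) \le 0$, with active set $A(x^*) = \{j : g_j(x^*) = 0\}$, CRCQ requires that for every pair of index subsets $I \subseteq \{1, \dots, p\}$ and $J \subseteq A(x^*)$, the family of gradients

\[
  \{\nabla h_i(x) : i \in I\} \cup \{\nabla g_j(x) : j \in J\}
\]

has constant rank for all $x$ in a neighborhood of $x^*$.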

A globally convergent trust-region LP-Newton method for nonsmooth functions under the Hölder metric subregularity

We describe and analyse a globally convergent algorithm to find a possibly nonisolated zero of a piecewise smooth mapping over a polyhedral set; this formulation includes Karush-Kuhn-Tucker (KKT) systems, variational inequality problems, and generalized Nash equilibrium problems. Our algorithm is based on a modification of the fast locally convergent Linear Programming (LP)-Newton method with a …
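For reference, the LP-Newton subproblem of Facchinei, Fischer, and Herrich, which the paper builds on, solves at each iterate $z$ in the polyhedral set $P$ (with the infinity norm, this is a linear program in $(d, \gamma)$):

\[
\begin{aligned}
  \min_{d,\gamma} \quad & \gamma \\
  \text{s.t.} \quad & \|F(z) + G(z)d\| \le \gamma\, \|F(z)\|^{2}, \\
  & \|d\| \le \gamma\, \|F(z)\|, \qquad z + d \in P,
\end{aligned}
\]

where $G(z)$ denotes a suitable (generalized) Jacobian of $F$ at $z$.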

Global convergence of Riemannian line search methods with a Zhang-Hager-type condition

In this paper, we analyze the global convergence of a general non-monotone line search method on Riemannian manifolds. To this end, we introduce some properties of the tangent search directions that guarantee the convergence of this family of optimization methods to a stationary point under appropriate assumptions. A modified version of the non-monotone line search …
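For reference, the Zhang-Hager-type condition named in the title has the following standard Euclidean form; on a manifold, the trial point $x_k + \alpha_k d_k$ is replaced by a retraction of $\alpha_k d_k$ and the slope term by the Riemannian inner product of $\operatorname{grad} f(x_k)$ with $d_k$:

\[
  f(x_k + \alpha_k d_k) \le C_k + \delta\, \alpha_k\, \nabla f(x_k)^{\top} d_k,
  \qquad \delta \in (0,1),
\]

with the reference value updated by

\[
  Q_{k+1} = \eta_k Q_k + 1, \qquad
  C_{k+1} = \frac{\eta_k Q_k C_k + f(x_{k+1})}{Q_{k+1}},
  \qquad C_0 = f(x_0),\ Q_0 = 1,
\]

where $\eta_k \in [0,1]$ controls the degree of nonmonotonicity ($\eta_k = 0$ recovers the monotone Armijo condition).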

Minimization of L1 over L2 for sparse signal recovery with convergence guarantee

The ratio of the $L_1$ and $L_2$ norms, denoted by $L_1/L_2$, is attractive due to its scale-invariant property when approximating the $L_0$ norm to promote sparsity. In this paper, we incorporate the $L_1/L_2$ formalism into an unconstrained model in order to deal with both noiseless and noisy observations. To design an efficient algorithm, we derive …
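The scale-invariance claim is the elementary identity below, which mirrors the invariance of the $L_0$ norm under nonzero scaling and distinguishes $L_1/L_2$ from the $L_1$ norm alone:

\[
  \frac{\|c\,x\|_1}{\|c\,x\|_2}
  = \frac{|c|\, \|x\|_1}{|c|\, \|x\|_2}
  = \frac{\|x\|_1}{\|x\|_2}
  \qquad \text{for all } c \neq 0,\ x \neq 0.
\]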

Economic inexact restoration for derivative-free expensive function minimization and applications

The Inexact Restoration approach has proved to be an adequate tool for handling the problem of minimizing an expensive function within an arbitrary feasible set by using different degrees of precision in the objective function. The Inexact Restoration framework allows one to obtain suitable convergence and complexity results for an approach that rationally combines low- …
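As a toy illustration of the "different degrees of precision" idea only, the following Python sketch refines an inexact objective oracle just enough to compare two points reliably; the oracle interface, tolerances, and halving rule are hypothetical and are not taken from the paper.

# Hypothetical adaptive-precision comparison: f_eps(z, eps) is assumed to
# return an approximation of f(z) with guaranteed error at most eps.
def approx_less(f_eps, x, y, eps=1e-1, eps_min=1e-8):
    """Decide whether f(x) < f(y), refining the precision only as needed."""
    while eps > eps_min:
        fx, fy = f_eps(x, eps), f_eps(y, eps)
        if abs(fx - fy) > 2 * eps:   # gap exceeds worst-case error: decide
            return fx < fy
        eps *= 0.5                   # ambiguous: pay for more precision
    return f_eps(x, eps_min) < f_eps(y, eps_min)

The point of the sketch is that cheap, low-precision evaluations suffice whenever the comparison is unambiguous, so expensive high-precision evaluations are reserved for the cases that actually need them.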