A Subspace Minimization Barzilai-Borwein Method for Multiobjective Optimization Problems

Nonlinear conjugate gradient methods have recently garnered significant attention within the multiobjective optimization community. These methods aim to keep their conjugate parameters consistent with those of their single-objective counterparts. However, whether multiobjective conjugate gradient methods preserve the attractive conjugacy property of the search directions remains uncertain, even in the quadratic case. This loss of interpretability of …
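For context, the Barzilai-Borwein step named in the title has a simple closed form in the single-objective setting. The sketch below illustrates that classical step on a convex quadratic; it is generic background under our own naming (bb_step, the test matrix A), not the paper's multiobjective subspace-minimization method.

    import numpy as np

    def bb_step(s, y):
        """Classical Barzilai-Borwein step size alpha = (s^T s) / (s^T y),
        with s = x_k - x_{k-1} and y = grad_k - grad_{k-1}."""
        sy = s @ y
        return s @ s / sy if sy > 0 else 1.0  # fall back when curvature is non-positive

    # Gradient descent with BB steps on f(x) = 0.5 x^T A x - b^T x (illustration only)
    A = np.diag([1.0, 10.0])
    b = np.array([1.0, 1.0])
    grad = lambda x: A @ x - b

    x_old, x = np.zeros(2), np.ones(2)
    for _ in range(30):
        s, y = x - x_old, grad(x) - grad(x_old)
        x_old, x = x, x - bb_step(s, y) * grad(x)
    print(x)  # approaches the solution A^{-1} b = (1, 0.1)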

Strong global convergence properties of algorithms for nonlinear symmetric cone programming

Sequential optimality conditions have played a major role in proving strong global convergence properties of numerical algorithms for many classes of optimization problems. In particular, the way complementarity is dealt with is fundamental to achieving a strong condition. Typically, one uses the inner product structure to measure complementarity, which gives a very general approach to a …
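To make the inner-product measure concrete: for a symmetric cone $K$ with Jordan product $\circ$, complementarity can be expressed equivalently through the inner product or the Jordan product, a standard fact of Euclidean Jordan algebras (background here, not the paper's specific sequential condition):

\[
x \in K,\;\; z \in K,\;\; \langle x, z \rangle = 0
\qquad\Longleftrightarrow\qquad
x \in K,\;\; z \in K,\;\; x \circ z = 0,
\]

so an algorithm may declare approximate complementarity once $|\langle x_k, z_k \rangle| \le \epsilon$ for nearly feasible $x_k, z_k$.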

Full-low evaluation methods for bound and linearly constrained derivative-free optimization

Derivative-free optimization (DFO) consists of finding the best value of an objective function without relying on derivatives. To tackle such problems, one may build approximate derivatives, using for instance finite-difference estimates. One may also design algorithmic strategies that explore the space and seek improvement over the current point. The first type of strategy often provides …
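As a concrete instance of the first strategy, the sketch below builds a forward-difference gradient estimate from function values alone; the helper fd_gradient and the step size are our illustrative choices, not part of the full-low evaluation framework itself.

    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        """Forward-difference estimate of grad f(x), component i given by
        (f(x + h e_i) - f(x)) / h; costs n + 1 evaluations in dimension n."""
        fx = f(x)
        g = np.empty_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2
    print(fd_gradient(f, np.array([0.0, 0.0])))  # close to the true gradient (-2, 2)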

Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems

We propose a modified BFGS algorithm for multiobjective optimization problems that is globally convergent even in the absence of convexity assumptions on the objective functions. Furthermore, we establish superlinear convergence of the method under the usual conditions. Our approach employs Wolfe step sizes and ensures that the Hessian approximations are updated and corrected at each iteration …
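For reference, the classical single-objective BFGS update that such methods adapt is the rank-two formula sketched below; the cautious skip when the curvature s^T y is not sufficiently positive is one common correction, shown here as generic background rather than the paper's specific rule.

    import numpy as np

    def bfgs_update(B, s, y, eps=1e-8):
        """Classical BFGS update of the Hessian approximation B:
        B+ = B - (B s s^T B) / (s^T B s) + (y y^T) / (s^T y).
        The update is skipped when s^T y is too small, keeping B positive definite."""
        sy = s @ y
        if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
            return B  # cautious skip: curvature condition fails
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy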

Constraint qualifications and strong global convergence properties of an augmented Lagrangian method on Riemannian manifolds

In recent years, augmented Lagrangian methods have been successfully applied to several classes of non-convex optimization problems, inspiring new developments in both theory and practice. In this paper we bring most of these recent developments from nonlinear programming to the context of optimization on Riemannian manifolds, including equality and inequality constraints. Much research has …
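For background, a standard Euclidean form of the augmented Lagrangian with both constraint types is the Powell-Hestenes-Rockafellar function below; a manifold version would typically keep this form while replacing Euclidean derivatives with Riemannian ones. This is textbook material, not a statement of the paper's precise algorithm.

\[
L_\rho(x,\lambda,\mu) = f(x) + \frac{\rho}{2}\sum_{i}\Big(h_i(x)+\frac{\lambda_i}{\rho}\Big)^{2} + \frac{\rho}{2}\sum_{j}\max\Big\{0,\; g_j(x)+\frac{\mu_j}{\rho}\Big\}^{2},
\]

with equality constraints $h_i(x)=0$, inequality constraints $g_j(x)\le 0$, multiplier estimates $\lambda,\mu$, and penalty parameter $\rho>0$.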

Polyhedral Newton-min algorithms for complementarity problems

The semismooth Newton method is a very efficient approach for computing a zero of a large class of nonsmooth equations. When the initial iterate is sufficiently close to a regular zero and the function is strongly semismooth, the generated sequence converges quadratically to that zero, while each iteration only requires solving a linear system. …
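To give the flavor of such an iteration, the sketch below applies a semismooth Newton (Newton-min) step to the linear complementarity problem min(x, Mx + q) = 0, taken componentwise; the active-set splitting and the toy data are our illustration, not the polyhedral variant developed in the paper.

    import numpy as np

    def newton_min_lcp(M, q, x, iters=20):
        """Newton-min for the LCP: find x with min(x, M x + q) = 0 componentwise,
        i.e. x >= 0, M x + q >= 0 and x_i (M x + q)_i = 0 for all i."""
        n = q.size
        for _ in range(iters):
            F = np.minimum(x, M @ x + q)
            if np.linalg.norm(F) < 1e-12:
                break
            # One element of the generalized Jacobian: identity rows where
            # x < M x + q, the corresponding rows of M elsewhere.
            J = np.where((x < M @ x + q)[:, None], np.eye(n), M)
            x = x - np.linalg.solve(J, F)  # each iteration solves one linear system
        return x

    M = np.array([[2.0, 1.0], [1.0, 2.0]])
    q = np.array([-1.0, 1.0])
    print(newton_min_lcp(M, q, np.zeros(2)))  # converges to (0.5, 0)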

A worst-case complexity analysis for Riemannian non-monotone line-search methods

In this paper we deal with non-monotone line-search methods to minimize a smooth cost function on a Riemannian manifold. In particular, we study the number of iterations necessary for this class of algorithms to obtain ε-approximate stationary points. Specifically, we prove that, under a regularity Lipschitz-type condition on the pullbacks of the cost function to …
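A typical non-monotone Armijo condition of the kind analysed here accepts a step once the new value improves on the maximum over a memory window rather than on the last iterate alone. Written in Euclidean notation for simplicity (on a manifold, $x_k + \alpha_k d_k$ is replaced by a retraction), it reads

\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le \min\{k,\, m\}} f(x_{k-j}) + c\,\alpha_k\, \nabla f(x_k)^{\top} d_k, \qquad c \in (0,1),
\]

where $m \ge 0$ is the memory length and $m = 0$ recovers the classical monotone Armijo rule.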

A Sequential Quadratic Programming Method for Optimization with Stochastic Objective Functions, Deterministic Inequality Constraints and Robust Subproblems

In this paper, the robust sequential quadratic programming method of Burke and Han (Math. Programming, 1989) for constrained optimization is generalized to problems with a stochastic objective function and deterministic equality and inequality constraints. A stochastic line-search scheme from Paquette and Scheinberg (SIOPT, 2020) is employed to globalize the steps. We show that in the case …
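For orientation, the quadratic subproblem solved at each SQP iterate $x_k$ has the generic form below, with a stochastic estimate $g_k$ standing in for the exact gradient $\nabla f(x_k)$ in the stochastic setting; this is standard background rather than the paper's specific robust subproblem.

\[
\min_{d}\;\; g_k^{\top} d + \tfrac{1}{2}\, d^{\top} H_k\, d
\quad \text{s.t.} \quad
c_E(x_k) + \nabla c_E(x_k)^{\top} d = 0, \qquad
c_I(x_k) + \nabla c_I(x_k)^{\top} d \le 0,
\]

where $H_k$ is a positive definite Hessian approximation and $c_E$, $c_I$ collect the equality and inequality constraints.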

A primal-dual majorization-minimization method for large-scale linear programs

We present a primal-dual majorization-minimization method for solving large-scale linear programs. A smooth barrier augmented Lagrangian (SBAL) function that is strictly convex for the dual linear program is derived. The majorization-minimization approach is naturally introduced to exploit the smoothness and convexity of the SBAL function. Our method only depends on a factorization of the constant matrix …
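The majorization-minimization principle invoked here is easy to state in generic form (our notation, not the paper's SBAL surrogate): at each iterate one minimizes a surrogate that lies above the objective and touches it at the current point, which automatically forces descent:

\[
u(x \mid x_k) \ge f(x)\;\; \forall x, \qquad u(x_k \mid x_k) = f(x_k), \qquad x_{k+1} = \arg\min_{x}\, u(x \mid x_k),
\]

\[
\text{so that} \quad f(x_{k+1}) \le u(x_{k+1} \mid x_k) \le u(x_k \mid x_k) = f(x_k).
\]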

A filter sequential adaptive cubic regularisation algorithm for nonlinear constrained optimization

In this paper, we propose a filter sequential adaptive regularisation algorithm using cubics (ARC) for solving nonlinear equality-constrained optimization problems. Similar to sequential quadratic programming methods, an ARC subproblem with linearized constraints is considered to obtain a trial step at each iteration. Composite-step methods and reduced Hessian methods are employed to tackle the linearized …
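In the unconstrained setting, an ARC trial step minimizes a cubic-regularised model of the objective; the method here adds linearized constraints to such a subproblem. The generic unconstrained model is

\[
m_k(d) = f(x_k) + \nabla f(x_k)^{\top} d + \tfrac{1}{2}\, d^{\top} B_k\, d + \tfrac{\sigma_k}{3}\, \|d\|^{3},
\]

where $B_k$ approximates the Hessian and the regularisation weight $\sigma_k$ is adapted from one iteration to the next.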