A conjugate gradient-based algorithm for large-scale quadratic programming problem with one quadratic constraint

In this paper, we consider the nonconvex quadratically constrained quadratic programming (QCQP) problem with one quadratic constraint. By employing the conjugate gradient method, an efficient algorithm is proposed that exploits the sparsity of the involved matrices and solves the problem via a sequence of positive definite systems of linear equations after identifying … Read more
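
The truncated abstract does not show the systems themselves; as a rough, self-contained sketch of the kind of step such a method relies on, the snippet below solves one sparse positive definite linear system by conjugate gradients with SciPy. The matrix and right-hand side are made up for the illustration and are not taken from the paper.

    # Illustration only: one sparse symmetric positive definite solve by CG.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg

    n = 10_000
    # Sparse SPD matrix (shifted tridiagonal Laplacian), stored in CSR format.
    A = sp.diags([-1.0, 2.5, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)

    # CG needs only matrix-vector products, so sparsity is exploited directly.
    x, info = cg(A, b, atol=1e-10)
    print("converged:", info == 0, "residual:", np.linalg.norm(A @ x - b))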

Design, Implementation and Simulation of an MPC algorithm for Switched Nonlinear Systems under Combinatorial Constraints

Within this work, we present a warm-started algorithm for Model Predictive Control (MPC) of switched nonlinear systems under combinatorial constraints based on Combinatorial Integral Approximation (CIA). To facilitate high-speed solutions, we introduce a preprocessing step for complexity reduction of CIA problems, and include this approach within a new toolbox for the solution of CIA problems with … Read more
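
The paper's toolbox, preprocessing step and combinatorial constraints are not reproduced here, but the flavor of the rounding problem behind CIA can be illustrated with a minimal single-control sum-up rounding sketch (all names and data below are hypothetical):

    def sum_up_rounding(alpha, dt):
        """Round relaxed values alpha[i] in [0, 1] on a time grid with steps dt[i]
        to binaries while keeping the accumulated control deviation small."""
        beta, acc = [], 0.0
        for a, h in zip(alpha, dt):
            acc += a * h
            b = 1 if acc >= 0.5 * h else 0
            acc -= b * h
            beta.append(b)
        return beta

    print(sum_up_rounding([0.3, 0.7, 0.9, 0.1], [1.0] * 4))  # -> [0, 1, 1, 0]

A CIA solver generalizes this idea to several controls coupled by SOS1-type and other combinatorial constraints, which is where the complexity reduction mentioned in the abstract becomes relevant.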

Finite convergence and weak sharpness for solutions of nonsmooth variational inequalities in Hilbert spaces

This paper deals with the study of weak sharp solutions for nonsmooth variational inequalities and the finite convergence property of the proximal point method. We present several characterizations of weak sharpness of the solution set of nonsmooth variational inequalities without using gap functions. We show that, under weak sharpness of the solution set, the sequence … Read more
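
As a hedged illustration (not the paper's algorithm), the proximal point iteration referred to above, specialized to minimizing a convex function, reads x_{k+1} = argmin_y f(y) + ||y - x_k||^2 / (2*lam). The toy objective below has a weakly sharp solution set, which is the situation in which finite convergence can be expected:

    import numpy as np
    from scipy.optimize import minimize

    def prox_point(f, x0, lam=1.0, iters=10):
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            # Proximal subproblem: minimize f(y) + ||y - x||^2 / (2 * lam).
            res = minimize(lambda y: f(y) + np.sum((y - x) ** 2) / (2.0 * lam),
                           x, method="Nelder-Mead")
            x = res.x
        return x

    f = lambda y: np.sum(np.abs(y - 1.0))   # sharp, nonsmooth toy objective
    print(prox_point(f, np.zeros(3)))       # approaches the minimizer (1, 1, 1)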

A stochastic Levenberg-Marquardt method using random models with complexity results and application to data assimilation

Globally convergent variants of the Gauss-Newton algorithm are often the methods of choice to tackle nonlinear least-squares problems. Among such frameworks, Levenberg-Marquardt and trust-region methods are two well-established, similar paradigms. Both schemes have been studied when the Gauss-Newton model is replaced by a random model that is only accurate with a given probability. Trust-region schemes … Read more
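
For readers who want a concrete picture, a minimal deterministic Levenberg-Marquardt sketch for min 0.5*||r(x)||^2 is given below; the stochastic variant studied in the paper replaces the residual and Jacobian by random models, which is not reproduced here. Data and functions are made up.

    import numpy as np

    def levenberg_marquardt(r, J, x0, lam=1.0, iters=50):
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            rx, Jx = r(x), J(x)
            # Regularized Gauss-Newton step: (J^T J + lam I) p = -J^T r.
            p = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -Jx.T @ rx)
            if 0.5 * np.sum(r(x + p) ** 2) < 0.5 * np.sum(rx ** 2):
                x, lam = x + p, max(lam / 10.0, 1e-12)   # accept, trust the model more
            else:
                lam *= 10.0                              # reject, regularize more
        return x

    # Toy linear least squares: r(x) = A x - b, J(x) = A.
    A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
    b = np.array([1.0, 2.0, 3.0])
    print(levenberg_marquardt(lambda x: A @ x - b, lambda x: A, np.zeros(2)))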

First-order methods for the impatient: support identification in finite time with convergent Frank-Wolfe variants

In this paper, we focus on the problem of minimizing a non-convex function over the unit simplex. We analyze two well-known and widely used variants of the Frank-Wolfe algorithm and first prove global convergence of the iterates to stationary points when using either exact or Armijo line search. Then we show that the algorithms identify … Read more
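
As a hedged sketch of the setting (not the paper's exact variants, which concern support identification), a basic Frank-Wolfe method on the unit simplex with Armijo backtracking looks as follows; the objective is an arbitrary convex quadratic chosen for the demo:

    import numpy as np

    def frank_wolfe_simplex(f, grad, x0, iters=100, gamma=1e-4):
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            g = grad(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0        # linear minimization oracle on the simplex
            d = s - x                    # Frank-Wolfe direction
            t = 1.0
            while t > 1e-12 and f(x + t * d) > f(x) + gamma * t * (g @ d):
                t *= 0.5                 # Armijo backtracking
            x = x + t * d
        return x

    Q = np.array([[2.0, 0.5], [0.5, 1.0]])
    f = lambda x: 0.5 * x @ Q @ x
    grad = lambda x: Q @ x
    print(frank_wolfe_simplex(f, grad, np.array([0.5, 0.5])))  # ~ (0.25, 0.75)

The oracle returns the vertex whose coordinate has the smallest partial derivative, which is what makes identifying the support of a solution natural for this family of methods.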

A Unified Characterization of Nonlinear Scalarizing Functionals in Optimization

Over the years, several classes of scalarization techniques in optimization have been introduced and employed in deriving separation theorems, optimality conditions and algorithms. In this paper, we study the relationships between some of those classes in the sense of inclusion. We focus on three types of scalarizing functionals (by Hiriart-Urruty, Drummond and Svaiter, Gerstewitz) and … Read more
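
For reference (the truncated abstract does not state it), one widely used member of these families is the Gerstewitz (Tammer) functional: given a closed convex cone C in Y with nonempty interior and a direction k in int C, it is commonly defined as

    \varphi_{C,k}(y) \;=\; \inf \{\, t \in \mathbb{R} \;:\; y \in t\,k - C \,\},

which scalarizes vector-valued problems and underlies nonconvex separation theorems of the kind the abstract mentions.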

The Standard Pessimistic Bilevel Problem

Pessimistic bilevel optimization problems, like optimistic ones, possess a structure involving three interrelated optimization problems. Moreover, their finite infima are only attained under strong conditions. We address these difficulties within a framework of moderate assumptions and a perturbation approach, which allow us to approximate such finite infima arbitrarily well by minimal values of a sequence … Read more
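
As a point of reference (a generic statement, not necessarily the exact standard form analyzed in the paper), a pessimistic bilevel problem can be written as

    \min_{x \in X} \; \sup_{y \in S(x)} F(x, y),
    \qquad
    S(x) = \operatorname*{argmin}_{y} \{\, f(x, y) : g(x, y) \le 0 \,\},

so the upper-level minimization, the inner supremum over the lower-level solution set S(x), and the lower-level problem itself are the three interrelated problems mentioned above; attaining the resulting finite infimum is what typically requires strong conditions.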

An inexact strategy for the projected gradient algorithm in vector optimization problems on variable ordered spaces

Variable order structures model situations in which the comparison between two points depends on a point-to-cone map. In this paper, an inexact projected gradient method for solving smooth constrained vector optimization problems on variable ordered spaces is presented. It is shown that every accumulation point of the generated sequence satisfies the first order necessary optimality … Read more
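
As a loosely related, heavily simplified sketch (scalar-valued, fixed order, box constraint; the variable-ordering, point-to-cone setting of the paper is not reproduced), an inexact projected gradient step can be modeled by perturbing the point before projecting with an error that vanishes along the iterations:

    import numpy as np

    def inexact_projected_gradient(grad, proj, x0, step=0.1, iters=200):
        x = np.asarray(x0, dtype=float)
        for k in range(iters):
            y = x - step * grad(x)
            # Inexactness: a perturbation that is summable in k, a typical
            # requirement in inexact projection/gradient schemes.
            err = np.random.uniform(-1.0, 1.0, size=x.shape) / (k + 1) ** 2
            x = proj(y + err)
        return x

    proj_box = lambda y: np.clip(y, 0.0, 1.0)            # projection onto [0, 1]^n
    grad = lambda x: 2.0 * (x - np.array([0.2, 1.5]))    # gradient of ||x - c||^2
    print(inexact_projected_gradient(grad, proj_box, np.array([0.9, 0.1])))  # ~ (0.2, 1.0)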

Quasi-Newton approaches to Interior Point Methods for quadratic problems

Interior Point Methods (IPM) rely on the Newton method for solving systems of nonlinear equations. Solving the linear systems which arise from this approach is the most computationally expensive task of an interior point iteration. If, due to the problem's inner structure, there are special techniques for efficiently solving these linear systems, IPMs enjoy fast convergence and … Read more
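
For concreteness (standard textbook form, not quoted from the abstract): for a convex quadratic program min 0.5 x^T Q x + c^T x subject to A x = b, x >= 0, the linear system solved at each primal-dual interior point iteration is the Newton system of the perturbed KKT conditions,

    \begin{aligned}
    Q\,\Delta x - A^{\top}\Delta y - \Delta s &= -(Q x + c - A^{\top} y - s),\\
    A\,\Delta x &= b - A x,\\
    S\,\Delta x + X\,\Delta s &= \sigma \mu e - X S e,
    \end{aligned}

with X = diag(x), S = diag(s) and \mu = x^{\top} s / n; a quasi-Newton approach aims to reuse or cheaply update information about this system rather than recompute a full factorization at every iteration.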

New sequential optimality conditions for mathematical problems with complementarity constraints and algorithmic consequences

In recent years, the theoretical convergence of iterative methods for solving nonlinear constrained optimization problems has been addressed using sequential optimality conditions, which are satisfied by minimizers independently of constraint qualifications (CQs). Even though there is a considerable literature devoted to sequential conditions for standard nonlinear optimization, the same is not true for Mathematical Problems … Read more
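
For standard nonlinear programming, the prototypical sequential condition is the Approximate KKT (AKKT) condition: a feasible point x* of min f(x) s.t. g(x) <= 0, h(x) = 0 satisfies AKKT if there exist sequences x^k -> x*, \lambda^k >= 0 and \mu^k such that

    \Big\| \nabla f(x^k) + \sum_i \lambda_i^k \nabla g_i(x^k) + \sum_j \mu_j^k \nabla h_j(x^k) \Big\| \to 0,
    \qquad
    \min\{-g_i(x^k), \lambda_i^k\} \to 0 \ \text{for all } i.

The conditions proposed in the paper adapt this kind of sequential statement to the complementarity-constrained setting, where standard constraint qualifications typically fail.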