A Redistributed Proximal Bundle Method for Nonconvex Optimization

Proximal bundle methods have been shown to be highly successful optimization methods for unconstrained convex problems with discontinuous first derivatives. This naturally leads to the question of whether proximal variants of bundle methods can be extended to a nonconvex setting. This work proposes an approach based on generating cutting-plane models, not of the objective function …
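As background for the truncated abstract above: a cutting-plane model is the pointwise maximum of linearizations built from function values and subgradients collected at trial points. The following minimal Python sketch is illustrative only; it builds a generic bundle model and is not the redistributed variant the paper proposes.

```python
import numpy as np

def cutting_plane_model(points, values, subgrads):
    """Build a cutting-plane (bundle) model: the pointwise maximum of the
    linearizations f(y_i) + <g_i, x - y_i> collected at trial points y_i."""
    def model(x):
        x = np.asarray(x, dtype=float)
        return max(f_i + g_i @ (x - y_i)
                   for y_i, f_i, g_i in zip(points, values, subgrads))
    return model

# Toy example for f(x) = |x|, with subgradients sign(x), bundled at y = -1 and y = 2.
pts = [np.array([-1.0]), np.array([2.0])]
vals = [1.0, 2.0]
gs = [np.array([-1.0]), np.array([1.0])]
m = cutting_plane_model(pts, vals, gs)
```

For a convex function the model underestimates it everywhere and agrees with it at the bundle points; proximal bundle methods minimize such a model plus a quadratic proximal term at each iteration.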

A quasisecant method for minimizing nonsmooth functions

In this paper a new algorithm for the local minimization of nonsmooth, nonconvex functions is developed. We introduce the notions of secants and quasisecants for nonsmooth functions, and apply quasisecants to find descent directions of locally Lipschitz functions. We design a minimization algorithm that uses these quasisecant directions. We prove that this algorithm converges …
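To illustrate the general idea of secant-based descent: a secant approximates a directional derivative from two function values, so comparing secant values over candidate directions suggests where the function decreases. This is a toy sketch of that idea only, not the paper's quasisecant construction; the function and directions below are invented for illustration.

```python
def secant(f, x, d, h=1e-3):
    """Secant approximation of the directional derivative f'(x; d),
    using the difference quotient (f(x + h*d) - f(x)) / h."""
    return (f([xi + h * di for xi, di in zip(x, d)]) - f(x)) / h

def descent_direction(f, x, candidates, h=1e-3):
    """Pick the candidate direction with the most negative secant value;
    return None if no candidate yields descent (a toy stand-in for the
    quasisecant machinery of the paper)."""
    best = min(candidates, key=lambda d: secant(f, x, d, h))
    return best if secant(f, x, best, h) < 0 else None

# For the nonsmooth f(x) = |x| at x = 2, the direction -1 is a descent direction.
f = lambda x: abs(x[0])
d = descent_direction(f, [2.0], [[1.0], [-1.0]])
```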

Fejer processes with diminishing disturbances and decomposition of constrained nondifferentiable optimization problems

Iterative processes based on Fejer mappings with diminishing problem-specific shifts in the arguments are considered. Such a structure allows fine-tuning of Fejer processes by directing them toward selected subsets of attracting sets. The use of various Fejer operators provides ample opportunities for decomposition and parallel computation. Subgradient projection algorithms with sequential and simultaneous projections on segmented constraints …
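A minimal illustration of such a process, assuming the simplest case of sequential projections onto halfspace constraints (a Fejer mapping) perturbed by a diminishing shift: this is a toy sketch under those assumptions, not the decomposition schemes analyzed in the paper.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Euclidean projection of x onto {y : a·y <= b}; identity when feasible."""
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def fejer_process(x0, constraints, shift, iters=50):
    """Sequential halfspace projections (a Fejer process) with a diminishing
    disturbance shift(k)/(k+1) that steers the iterates toward a selected
    part of the feasible set."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        x = x + shift(k) / (k + 1)        # diminishing disturbance
        for a, b in constraints:          # sequential projections
            x = project_halfspace(x, a, b)
    return x

# Feasible set [0, 1] on the line; a constant upward shift steers the
# process toward the upper endpoint x = 1.
constraints = [(np.array([1.0]), 1.0), (np.array([-1.0]), 0.0)]
result = fejer_process(np.array([5.0]), constraints, lambda k: np.array([1.0]))
```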

A second derivative SQP method: local convergence

Gould and Robinson (NAR 08/18, Oxford University Computing Laboratory, 2008) gave global convergence results for a second-derivative SQP method for minimizing the exact $\ell_1$-merit function for a \emph{fixed} value of the penalty parameter. To establish these results, we used the properties of the so-called Cauchy step, which was itself computed from the so-called predictor step. …

Incremental-like Bundle Methods with Application to Energy Planning

An important field of application of nonsmooth optimization is the decomposition of large-scale or complex problems by Lagrangian duality. In this setting, the dual problem consists of maximizing a concave nonsmooth function that is defined as the sum of sub-functions. Evaluating each sub-function requires solving a specific optimization sub-problem, each with its own computational complexity. …
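To make the sum-of-sub-functions structure concrete, here is a toy separable example (my own construction, not from the paper): the Lagrangian dual of min Σ c_i x_i subject to Σ x_i ≥ d and 0 ≤ x_i ≤ u_i. Each sub-function is evaluated by its own trivial sub-problem, and the dual value and a subgradient are assembled term by term.

```python
def dual_subproblem(c_i, u_i, lam):
    """Solve one sub-problem min_{0 <= x <= u_i} (c_i - lam) * x;
    returns its optimal value and minimizer."""
    x = u_i if c_i - lam < 0 else 0.0
    return (c_i - lam) * x, x

def dual_function(c, u, d, lam):
    """Concave dual q(lam) = lam*d + sum_i min_x (c_i - lam)*x_i, evaluated
    sub-function by sub-function; also returns a subgradient d - sum_i x_i."""
    val, g = lam * d, d
    for c_i, u_i in zip(c, u):
        v_i, x_i = dual_subproblem(c_i, u_i, lam)
        val += v_i
        g -= x_i
    return val, g

val, g = dual_function(c=[1.0, 2.0], u=[3.0, 3.0], d=4.0, lam=1.5)
```

Incremental-like bundle methods exploit exactly this structure: when sub-problems have very different costs, the dual can be updated after evaluating only some of the sub-functions rather than all of them.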

A second derivative SQP method: theoretical issues

Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be …

Impulsive Optimal Control of Hybrid Finite-Dimensional Lagrangian Systems

This dissertation addresses numerical and theoretical issues in the impulsive control of hybrid finite-dimensional Lagrangian systems. To treat these aspects, a modeling framework is presented based on the measure-differential-inclusion representation of the Lagrangian dynamics. The main advantage of this representation is that it enables the incorporation of set-valued force …

A second derivative SQP method with imposed descent

Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be …

Parallel Space Decomposition of the Mesh Adaptive Direct Search algorithm

This paper describes a parallel space decomposition (PSD) technique for the mesh adaptive direct search (MADS) algorithm. MADS extends generalized pattern search to constrained nonsmooth optimization problems. The objective of the present work is to obtain good solutions to larger problems than those typically solved by MADS. The new PSD-MADS method is an …

Primal interior point method for minimization of generalized minimax functions

In this report, we propose a primal interior-point method for large sparse generalized minimax optimization. After a short introduction, where the problem is stated, we introduce the basic equations of the Newton method applied to the KKT conditions and develop the primal interior-point method. Next we describe the basic algorithm and give more details concerning …