Necessary conditions for local optimality in d.c. programming

Using $\varepsilon$-subdifferential calculus for difference-of-convex (d.c.) programming, Dür proposed a condition sufficient for local optimality, and showed that this condition is not necessary in general. Here it is proved that whenever the convex part is strongly convex, this condition is also necessary. Strong convexity can always be ensured by changing the given d.c. decomposition slightly. … Read more
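As a worked illustration of the last remark (the specific quadratic shift below is an assumption for illustration, not necessarily the paper's construction): given a d.c. decomposition $f = g - h$, adding the same quadratic to both components leaves $f$ unchanged and makes the convex part strongly convex.

```latex
\[
  f(x) \;=\; g(x) - h(x)
       \;=\; \underbrace{\Bigl(g(x) + \tfrac{\rho}{2}\|x\|^{2}\Bigr)}_{\text{strongly convex for any } \rho>0}
       \;-\; \underbrace{\Bigl(h(x) + \tfrac{\rho}{2}\|x\|^{2}\Bigr)}_{\text{still convex}} .
\]
```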

A globally convergent primal-dual interior-point filter method for nonlinear programming: new filter optimality measures and computational results

In this paper we modify the original primal-dual interior-point filter method proposed in [18] for the solution of nonlinear programming problems. We introduce two new optimality filter entries based on the objective function, which are therefore better suited to the purposes of minimization, and we propose conditions for using inexact Hessians. We show that the global convergence … Read more
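For readers unfamiliar with filter methods, a minimal sketch of a generic filter acceptance test is given below; the measures and margins used here are illustrative assumptions, not the paper's new objective-based entries.

```python
# Generic two-entry filter acceptance test (illustrative sketch only).

def is_acceptable(theta_trial, phi_trial, filter_entries, gamma=1e-5):
    """Accept a trial point if, against every stored pair (theta_k, phi_k),
    it sufficiently reduces either the infeasibility measure theta or the
    optimality measure phi."""
    return all(
        theta_trial <= (1.0 - gamma) * theta_k
        or phi_trial <= phi_k - gamma * theta_k
        for theta_k, phi_k in filter_entries
    )

def add_to_filter(theta, phi, filter_entries):
    """Insert a new entry and discard entries it dominates."""
    filter_entries[:] = [
        (t, p) for t, p in filter_entries if not (theta <= t and phi <= p)
    ]
    filter_entries.append((theta, phi))
```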

SESOP-TN: Combining Sequential Subspace Optimization with Truncated Newton method

SESOP-TN is a method for very large scale unconstrained optimization of smooth functions. It combines ideas of Sequential Subspace Optimization (SESOP) [Narkiss-Zibulevsky-2005] with those of the Truncated Newton (TN) method. Replacing the TN line search with subspace optimization, we allow Conjugate Gradient (CG) iterations to stay matched across consecutive TN steps. This resolves the problem … Read more
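A minimal sketch of the idea of replacing a line search by a low-dimensional subspace minimization is shown below; the choice of spanning directions and the inner solver are assumptions, not the exact SESOP-TN construction.

```python
# Sketch: minimize over a small affine subspace instead of a 1-D line search.
import numpy as np
from scipy.optimize import minimize

def subspace_step(f, x, directions):
    """Minimize f restricted to x + span(directions); 'directions' might hold,
    e.g., the current gradient, previous steps, and the truncated-Newton step."""
    D = np.column_stack(directions)            # n x k matrix of directions
    reduced = lambda alpha: f(x + D @ alpha)   # objective in the k coefficients
    res = minimize(reduced, np.zeros(D.shape[1]), method="BFGS")
    return x + D @ res.x
```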

PSwarm: A Hybrid Solver for Linearly Constrained Global Derivative-Free Optimization

PSwarm was developed originally for the global optimization of functions without derivatives and where the variables are within upper and lower bounds. The underlying algorithm used is a pattern search method, more specifically a coordinate search method, which guarantees convergence to stationary points from arbitrary starting points. In the (optional) search step of coordinate search, … Read more
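The poll step of a coordinate search method, which is what drives the convergence guarantee mentioned above, can be sketched as follows; the bound handling and step-size update shown are assumptions, not PSwarm's exact rules.

```python
# Sketch of one coordinate-search poll step within bounds (illustrative only).
import numpy as np

def poll_step(f, x, alpha, lower, upper):
    """Try +/- alpha along each coordinate, clipped to the bounds;
    return an improving point if one is found."""
    fx = f(x)
    for i in range(len(x)):
        for sign in (+1.0, -1.0):
            y = x.copy()
            y[i] = np.clip(y[i] + sign * alpha, lower[i], upper[i])
            if f(y) < fx:
                return y, True     # successful poll: keep the step size
    return x, False                # unsuccessful poll: the caller shrinks alpha
```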

Implicitly and Densely Discrete Black-Box Optimization Problems

This paper addresses derivative-free optimization problems where the variables lie implicitly in an unknown discrete closed set. The evaluation of the objective function follows a projection onto the discrete set, which is assumed dense rather than sparse. Such a mathematical setting is a rough representation of what is common in many real-life applications where, despite … Read more
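A minimal sketch of the evaluation mechanism described above is given below; the rounding map used as the projection is a stand-in assumption, since in the problem setting the discrete set itself is unknown to the optimizer.

```python
# Sketch: the objective is only evaluated after projecting onto a discrete set.
import numpy as np

def evaluate_through_projection(f, x, mesh=0.1):
    """The solver proposes a continuous point x, but f is only defined on a
    dense discrete set; evaluation goes through a projection onto that set."""
    x_proj = mesh * np.round(np.asarray(x, dtype=float) / mesh)  # stand-in projection
    return f(x_proj)
```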

A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization: complete results

We propose a new truncated Newton method for large scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton’s equation. In the current iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative … Read more
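A sketch of how a CG-based inner solver can return both an approximate Newton step and a negative curvature direction is shown below; how the two directions are combined in the proposed nonmonotone scheme is not reproduced here.

```python
# Sketch: truncated CG on H d = -g that also reports negative curvature.
import numpy as np

def truncated_cg(H, g, tol=1e-8, max_iter=200):
    """Return (d, p): d approximates the Newton step; p is a direction of
    non-positive curvature if one is encountered, otherwise None."""
    d = np.zeros_like(g)
    r = -g.copy()              # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0.0:
            return d, p        # negative (or zero) curvature detected
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d, None
```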

Quadratic regularizations in an interior-point method for primal block-angular problems

One of the most efficient interior-point methods for some classes of primal block-angular problems solves the normal equations by a combination of Cholesky factorizations and preconditioned conjugate gradient for, respectively, the block and linking constraints. Its efficiency depends on the spectral radius, which lies in [0,1), of a certain matrix in the definition of the preconditioner. Spectral radius close … Read more
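As background, a quadratic (proximal) regularization in an interior-point method typically amounts to adding a term of the following generic form to the barrier subproblem; this form is an assumption for illustration, not the paper's exact formulation.

```latex
\[
  \min_{x}\;\; c^{\top}x \;-\; \mu \sum_{j} \ln x_{j}
            \;+\; \tfrac{\rho}{2}\,\|x-\bar{x}\|^{2}
  \qquad \text{s.t.}\;\; Ax = b,
\]
```

where $\bar{x}$ is a reference point (e.g., the current iterate) and $\rho \ge 0$ controls the strength of the regularization.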

A Subspace Limited Memory BFGS Algorithm For Box Constrained Optimization

In this paper, a subspace limited memory BFGS algorithm is proposed for bound constrained optimization. Global convergence is established under suitable conditions. Numerical results show that this method is more competitive than the standard method.

Descent heuristics for unconstrained minimization

Semidefinite relaxations often provide excellent starting points for nonconvex problems with multiple local minimizers. This work aims to find a local minimizer within a certain neighborhood of the starting point and with a small objective value. Several approaches are motivated and compared with each other. Citation: Report, Mathematisches Institut, Universitaet Duesseldorf, August 2008. … Read more

Optimal steepest descent algorithms for unconstrained convex problems: fine tuning Nesterov’s method

We modify the first order algorithm for convex programming proposed by Nesterov. The resulting algorithm keeps the optimal complexity obtained by Nesterov, with no need for a known Lipschitz constant of the gradient, and performs better in practically all examples in a set of test problems. Citation: Technical Report, Federal University of Santa Catarina, 2008. … Read more
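A hedged sketch of the standard device that removes the need for a known Lipschitz constant, namely backtracking on a local estimate L inside an accelerated gradient step, is given below; the paper's precise rule may differ.

```python
# Sketch: accelerated gradient step with backtracking on the Lipschitz estimate L.
import numpy as np

def accelerated_gradient(f, grad, x0, L=1.0, iters=100):
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        g = grad(y)
        while True:                     # increase L until the upper bound holds
            x_new = y - g / L
            if f(x_new) <= f(y) - (g @ g) / (2.0 * L):
                break
            L *= 2.0
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```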