A matrix-free approach to build band preconditioners for large-scale bound-constrained optimization

We propose a procedure for building symmetric positive definite band preconditioners for large-scale symmetric, possibly indefinite, linear systems, when the coefficient matrix is not explicitly available, but matrix-vector products involving it can be computed. We focus on linear systems arising in Newton-type iterations within matrix-free versions of projected methods for bound-constrained nonlinear optimization. In this … Read more
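As context, here is a minimal sketch of the classical probing idea for recovering a band approximation of a symmetric matrix from matrix-vector products alone; the names `band_probe`, `matvec`, and the half-bandwidth `beta` are illustrative, and the authors' procedure additionally has to deliver a positive definite preconditioner (e.g., via a modified factorization), which this sketch does not attempt.

```python
import numpy as np

def band_probe(matvec, n, beta):
    """Estimate the band of half-bandwidth beta of an n x n symmetric matrix
    known only through matvec(v) = A @ v, using 2*beta + 1 probing vectors.
    Entries of A outside the band leak into the estimate, so the result is
    exact only when A itself is banded."""
    p = 2 * beta + 1
    B = np.zeros((n, n))
    for k in range(p):
        v = np.zeros(n)
        v[k::p] = 1.0                    # probe one residue class of columns
        Av = matvec(v)
        for j in range(k, n, p):         # columns selected by this probe
            lo, hi = max(0, j - beta), min(n, j + beta + 1)
            B[lo:hi, j] = Av[lo:hi]      # keep only the rows inside the band
    return 0.5 * (B + B.T)               # symmetrize the estimate
```

The resulting band matrix could then be factorized and applied as a preconditioner inside a Krylov solver.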

A Linesearch-based Derivative-free Approach for Nonsmooth Optimization

In this paper, we propose new linesearch-based methods for nonsmooth optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequality constrained optimization … Read more
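For orientation, a sketch of the kind of derivative-free sufficient-decrease test such linesearches are built on; the constants `gamma` and `delta` and the expansion rule below are illustrative choices, not the paper's exact acceptance conditions.

```python
def df_linesearch(f, x, d, alpha0=1.0, gamma=1e-6, delta=0.5, max_expand=20):
    """Derivative-free linesearch along direction d: a step alpha is accepted
    when f(x + alpha*d) <= f(x) - gamma*alpha**2; the step is shrunk until the
    test holds (returning 0 if it never does) and then expanded while it keeps
    holding."""
    fx, alpha = f(x), alpha0
    while f(x + alpha * d) > fx - gamma * alpha**2:
        alpha *= delta
        if alpha < 1e-12:
            return 0.0
    for _ in range(max_expand):
        trial = alpha / delta
        if f(x + trial * d) <= fx - gamma * trial**2:
            alpha = trial
        else:
            break
    return alpha
```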

A filter method with unified step computation for nonlinear optimization

We present a filter linesearch method for solving general nonlinear and nonconvex optimization problems. The method is of the filter variety, but uses a robust (always feasible) subproblem based on an exact penalty function to compute a search direction. This contrasts with traditional filter methods, which use a (separate) restoration phase designed to reduce infeasibility until … Read more
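For context, one common way such a subproblem is built is from an ℓ1 exact penalty; the formulation below is a standard instantiation rather than the paper's specific construction.

\[
\phi(x;\rho) = f(x) + \rho\, v(x), \qquad
v(x) = \sum_{i\in\mathcal{E}} |c_i(x)| + \sum_{i\in\mathcal{I}} \max\{0,\, -c_i(x)\},
\]

where the filter itself stores pairs \((v(x_k), f(x_k))\) and accepts a trial point only if it is not dominated by a stored pair.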

GENERALIZATIONS OF THE DENNIS-MORÉ THEOREM II

This paper is a continuation of our previous paper, where we presented generalizations of the Dennis-Moré theorem to characterize q-superlinear convergence of quasi-Newton methods for solving equations and variational inequalities in Banach spaces. Here we prove Dennis-Moré type theorems for inexact quasi-Newton methods applied to variational inequalities in finite dimensions. We first consider variational inequalities … Read more
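For reference, the classical finite-dimensional Dennis-Moré characterization for smooth equations \(F(x)=0\) with quasi-Newton steps \(s_k = x_{k+1}-x_k = -B_k^{-1}F(x_k)\): under the usual smoothness and nonsingularity assumptions at a solution \(x^*\), the iterates converge q-superlinearly if and only if

\[
\lim_{k\to\infty} \frac{\bigl\|\bigl(B_k - F'(x^*)\bigr)\, s_k\bigr\|}{\|s_k\|} = 0 .
\]

The paper's results extend this type of characterization to inexact quasi-Newton methods and to variational inequalities.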

The proximal-proximal gradient algorithm

We consider the problem of minimizing a convex objective which is the sum of a smooth part, with Lipschitz continuous gradient, and a nonsmooth part. Inspired by various applications, we focus on the case when the nonsmooth part is a composition of a proper closed convex function P and a nonzero affine map, with the … Read more
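Concretely, the problem class and the basic proximal gradient step it generalizes can be written as follows, with stepsize \(t\) and \(g = P\circ(\mathcal{A}\cdot + b)\); the difficulty such a method must address is that the proximal mapping of the composition \(g\) is generally not available in closed form even when that of \(P\) is.

\[
\min_{x}\; f(x) + P(\mathcal{A}x + b), \qquad
x^{k+1} = \operatorname{prox}_{t g}\bigl(x^{k} - t\,\nabla f(x^{k})\bigr).
\]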

An example of slow convergence for Newton’s method on a function with globally Lipschitz continuous Hessian

An example is presented where Newton’s method for unconstrained minimization is applied to find an $\epsilon$-approximate first-order critical point of a smooth function and takes a multiple of $\epsilon^{-2}$ iterations and function evaluations to terminate, which is as many as the steepest-descent method requires in its worst case. The novel feature of the proposed example is that … Read more
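In this setting an \(\epsilon\)-approximate first-order critical point is simply a point with a small gradient, and the worst-case complexity referred to is

\[
\|\nabla f(x_k)\| \le \epsilon \quad\text{reached only after } O(\epsilon^{-2}) \text{ iterations,}
\]

which is the same order as the classical worst-case bound for steepest descent on functions with Lipschitz continuous gradients.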

Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

In this paper we analyze several new methods for solving nonconvex optimization problems with the objective function formed as a sum of two terms: one is nonconvex and smooth, and the other is convex but simple with known structure. Further, we consider both cases: unconstrained and linearly constrained nonconvex problems. For optimization problems of … Read more
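A minimal sketch of one such random coordinate step for the unconstrained case, with the convex term taken to be \(\lambda\|x\|_1\) (soft-thresholding is its coordinate-wise proximal map); the function and constant names are illustrative, and the paper's framework covers more general convex terms and the linearly constrained case.

```python
import numpy as np

def random_prox_cd(grad_i, lip, lam, x0, iters=1000, rng=None):
    """Random coordinate descent for min f(x) + lam*||x||_1, with f smooth and
    possibly nonconvex: pick a coordinate i at random, take a gradient step of
    size 1/L_i in that coordinate, then soft-threshold.  grad_i(x, i) is the
    i-th partial derivative of f and lip[i] its coordinate Lipschitz constant."""
    rng = rng or np.random.default_rng()
    x = x0.astype(float).copy()
    for _ in range(iters):
        i = rng.integers(len(x))
        t = x[i] - grad_i(x, i) / lip[i]
        x[i] = np.sign(t) * max(abs(t) - lam / lip[i], 0.0)
    return x
```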

Solution of Nonlinear Equations via Optimization

This paper presents four optimization models for solving nonlinear equation systems. The models accommodate both over-specified and under-specified systems. A variable endogenization technique that improves efficiency is introduced, and a basic comparative study shows one of the methods presented to be very effective.

Citation: Siwale, I. (2013). Solution of nonlinear equation systems via optimization. Technical … Read more
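One standard optimization reformulation (not necessarily one of the paper's four models) is to minimize the squared residual norm, which applies equally to over- and under-specified systems; the toy system below is only for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

def F(x):
    # illustrative square system: x0^2 + x1^2 = 4 and x0 * x1 = 1
    return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] * x[1] - 1.0])

# Solve F(x) = 0 by minimizing 0.5 * ||F(x)||^2
sol = least_squares(F, x0=np.array([1.0, 1.0]))
print(sol.x, sol.cost)
```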

Incremental Accelerated Gradient Methods for SVM Classification: Study of the Constrained Approach

We investigate constrained first order techniques for training Support Vector Machines (SVM) for online classification tasks. The methods exploit the structure of the SVM training problem and combine ideas of incremental gradient technique, gradient acceleration and successive simple calculations of Lagrange multipliers. Both primal and dual formulations are studied and compared. Experiments show that the … Read more
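For reference, the standard soft-margin primal and its box-constrained dual, which is the kind of constrained structure such first-order methods exploit:

\[
\min_{w,b}\;\tfrac12\|w\|^2 + C\sum_{i=1}^{m}\max\{0,\,1-y_i(w^{\top}x_i+b)\},
\]
\[
\max_{\alpha}\;\sum_{i=1}^{m}\alpha_i-\tfrac12\sum_{i,j}\alpha_i\alpha_j y_i y_j\,x_i^{\top}x_j
\quad\text{s.t.}\quad 0\le\alpha_i\le C,\;\; \sum_{i=1}^{m}\alpha_i y_i=0.
\]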

An Inexact Sequential Quadratic Optimization Algorithm for Nonlinear Optimization

We propose a sequential quadratic optimization method for solving nonlinear optimization problems with equality and inequality constraints. The novel feature of the algorithm is that, during each iteration, the primal-dual search direction is allowed to be an inexact solution of a given quadratic optimization subproblem. We present a set of generic, loose conditions that the … Read more
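For context, the quadratic subproblem whose (possibly inexact) primal-dual solution supplies the search direction is, in its standard form,

\[
\min_{d}\;\nabla f(x_k)^{\top}d+\tfrac12 d^{\top}H_k d
\quad\text{s.t.}\quad
c_{\mathcal{E}}(x_k)+\nabla c_{\mathcal{E}}(x_k)^{\top}d=0,\qquad
c_{\mathcal{I}}(x_k)+\nabla c_{\mathcal{I}}(x_k)^{\top}d\le 0,
\]

with \(H_k\) a (quasi-)Hessian of the Lagrangian; the paper's conditions quantify how inexactly this subproblem may be solved while the method's guarantees are preserved.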