On diagonally-relaxed orthogonal projection methods

We propose and study a block-iterative projection method for solving linear equations and/or inequalities. The method allows diagonal component-wise relaxation in conjunction with orthogonal projections onto the individual hyperplanes of the system, and is thus called diagonally-relaxed orthogonal projections (DROP). Diagonal relaxation has proven useful in accelerating the initial convergence of simultaneous and block-iterative projection … Read more
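As a rough illustration of the general idea (not the specific DROP weighting, which is defined in the paper), a simultaneous sweep of orthogonal projections onto the hyperplanes a_i^T x = b_i can be combined through a diagonal, component-wise relaxation vector; the `relax` vector below is a generic placeholder for whatever diagonal weights one chooses:

```python
import numpy as np

def simultaneous_projection_sweep(A, b, x, relax=None):
    """One simultaneous sweep of orthogonal projections onto the hyperplanes
    a_i^T x = b_i, combined through a diagonal (component-wise) relaxation
    vector. Illustrative sketch only; the diagonal weights used by DROP are
    defined in the paper, so `relax` here is a generic placeholder."""
    m, n = A.shape
    row_norms2 = np.sum(A * A, axis=1)      # ||a_i||^2 for each row
    residual = b - A @ x                    # b_i - a_i^T x
    # Average of the orthogonal projections of x onto the individual hyperplanes
    correction = A.T @ (residual / row_norms2) / m
    if relax is None:
        relax = np.ones(n)                  # trivial relaxation by default
    return x + relax * correction
```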

Using Simplex Gradients of Nonsmooth Functions in Direct Search Methods

It has been shown recently that the efficiency of direct search methods that use opportunistic polling in positive spanning directions can be improved significantly by reordering the poll directions according to descent indicators built from simplex gradients. The purpose of this paper is twofold. First, we analyze the properties of simplex gradients of nonsmooth functions … Read more
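To make the descent indicators concrete, here is a minimal sketch of a simplex gradient built from a canonical forward-difference simplex and used to reorder a set of poll directions; the sample set and the step `h` are illustrative choices, not the paper's:

```python
import numpy as np

def simplex_gradient(f, x, h=1e-2):
    """Simplex gradient of f at x using the simplex {x, x + h e_1, ..., x + h e_n}.
    With this particular (nondegenerate) simplex it reduces to forward differences."""
    n = x.size
    fx = f(x)
    return np.array([(f(x + h * np.eye(n)[i]) - fx) / h for i in range(n)])

def order_poll_directions(directions, g):
    """Reorder poll directions so those most aligned with -g are polled first,
    the kind of descent indicator used to speed up opportunistic polling."""
    return sorted(directions, key=lambda d: float(np.dot(g, d)))
```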

Global Convergence of General Derivative-Free Trust-Region Algorithms to First and Second Order Critical Points

In this paper we prove global convergence to first- and second-order stationary points for a class of derivative-free trust-region methods for unconstrained optimization. These methods are based on the sequential minimization of linear or quadratic models built from evaluating the objective function at sample sets. The derivative-free models are required to satisfy Taylor-type bounds but, … Read more
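The Taylor-type bounds referred to are usually of the "fully linear" kind; in the standard notation (constants $\kappa_{ef}, \kappa_{eg}$ and trust-region radius $\Delta_k$, assumed here rather than quoted from this abstract), a model $m_k$ is required to satisfy

\[
|f(y) - m_k(y)| \le \kappa_{ef}\,\Delta_k^2
\quad\text{and}\quad
\|\nabla f(y) - \nabla m_k(y)\| \le \kappa_{eg}\,\Delta_k
\qquad \text{for all } y \in B(x_k;\Delta_k),
\]

with an analogous quadratic-order ("fully quadratic") version when second-order criticality is sought.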

Proximal Thresholding Algorithm for Minimization over Orthonormal Bases

The notion of soft thresholding plays a central role in problems from various areas of applied mathematics, in which the ideal solution is known to possess a sparse decomposition in some orthonormal basis. Using convex-analytical tools, we extend this notion to that of proximal thresholding and investigate its properties, providing in particular several characterizations of … Read more
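For context, the classical soft-thresholding operation that the paper generalizes acts coefficient-wise on the expansion of the iterate in the orthonormal basis; a minimal sketch (generic threshold `gamma`, not the paper's notation):

```python
import numpy as np

def soft_threshold(c, gamma):
    """Soft-threshold the coefficient vector c at level gamma > 0: shrink every
    coefficient toward zero by gamma and set it to zero if its magnitude is
    below gamma. This is the proximity operator of gamma * ||.||_1."""
    return np.sign(c) * np.maximum(np.abs(c) - gamma, 0.0)
```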

Variational Problems in Quasi-Newton Methods

It has been known since the early 1970s that the Hessian matrices in quasi-Newton methods can be updated by variational means, in several different ways. The usual formulation of these variational problems uses a coordinate system, and the symmetry of the Hessian matrices is enforced through explicit constraints. As a result, the variational problems seem … Read more
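A representative instance of such a coordinate-based variational problem, written in the usual secant notation with pair $(s_k, y_k)$ and a weighted norm $\|\cdot\|_W$ (this notation is assumed here, not quoted from the paper), is

\[
\min_{B \in \mathbb{R}^{n \times n}} \; \|B - B_k\|_W
\quad\text{subject to}\quad B = B^{T}, \qquad B\,s_k = y_k ,
\]

which for a suitable weighting recovers the DFP update; posing the analogous problem for the inverse Hessian approximation recovers BFGS.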

Numerical Experience with a Recursive Trust-Region Method for Multilevel Nonlinear Optimization

We consider an implementation of the recursive multilevel trust-region algorithm proposed by Gratton, Sartenaer, Toint (2004), and provide significant numerical experience on multilevel test problems. A suitable choice of the algorithm’s parameters is identified on these problems, yielding a very satisfactory compromise between reliability and efficiency. The resulting default algorithm is then compared to alternative … Read more

An Extension of the Proximal Point Method for Quasiconvex Minimization

In this paper we propose an extension of the proximal point method to solve minimization problems with quasiconvex objective functions on the Euclidean space and the nonnegative orthant. For the unconstrained minimization problem, assuming that the function is bounded from below and lower semicontinuous, we prove that the iterations $\{x^k\}$ given by $0 \in \hat{\partial}\big(f(\cdot) + \tfrac{\lambda_k}{2}\,\|\cdot - x^{k-1}\|^2\big)(x^k)$ are … Read more
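A minimal numerical sketch of this iteration, assuming each subproblem is handled by a generic local solver rather than through the subdifferential inclusion above (the names `lambdas` and `n_iter` are illustrative, not the paper's):

```python
import numpy as np
from scipy.optimize import minimize

def proximal_point(f, x0, lambdas, n_iter=50):
    """Proximal point iterations: x^k approximately minimizes
    f(.) + (lambda_k / 2) * ||. - x^{k-1}||^2.
    Illustrative only: the paper characterizes x^k through a Frechet
    subdifferential inclusion for quasiconvex f, not by an exact solve."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        lam = lambdas[k] if k < len(lambdas) else lambdas[-1]
        prev = x.copy()
        res = minimize(lambda z, p=prev, l=lam: f(z) + 0.5 * l * np.sum((z - p) ** 2), prev)
        x = res.x
    return x
```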

On the divergence of line search methods

We discuss the convergence of line search methods for minimization. We explain how Newton's method and the BFGS method can fail even if the restrictions of the objective function to the search lines are strictly convex, the level sets of the objective function are compact, the line searches are exact, and the Wolfe conditions … Read more

Perturbed projections and subgradient projections for the multiple-sets split feasibility problem

We study the multiple-sets split feasibility problem, which requires finding a point closest to a family of closed convex sets in one space such that its image under a linear transformation is closest to another family of closed convex sets in the image space. By casting the problem into an equivalent problem in … Read more
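For orientation, a common way to measure constraint violation in this problem is the proximity function $p(x) = \tfrac12 \sum_i \alpha_i \|x - P_{C_i}x\|^2 + \tfrac12 \sum_j \beta_j \|Ax - P_{Q_j}(Ax)\|^2$; the step below is an illustrative gradient-projection-style sketch on $p$, not the paper's perturbed or subgradient projection scheme (the weights, step size, and projection callables are assumptions):

```python
import numpy as np

def msfp_gradient_step(x, A, proj_C, proj_Q, alphas, betas, step):
    """One gradient step on the proximity function p described above.
    proj_C and proj_Q are lists of callables computing the orthogonal
    projections onto the sets C_i and Q_j respectively."""
    Ax = A @ x
    grad = sum(a * (x - P(x)) for a, P in zip(alphas, proj_C))
    grad = grad + A.T @ sum(b * (Ax - Q(Ax)) for b, Q in zip(betas, proj_Q))
    return x - step * grad
```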

Coordinate and Subspace Optimization Methods for Linear Least Squares with Non-Quadratic Regularization

This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separable regularization. Despite arising frequently in applications, the RLS problem is often hard to solve using standard iterative methods. In a recent work [10], a new iterative method called Parallel Coordinate Descent (PCD) was devised. We provide herein a convergence … Read more
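To fix ideas, one sequential coordinate-descent pass for an l1-regularized instance (the l1 term stands in for a generic separable, non-quadratic regularizer; this is not the PCD method of [10] itself) looks as follows:

```python
import numpy as np

def coordinate_descent_pass(A, b, x, lam):
    """One in-place coordinate-descent pass for 0.5*||A x - b||^2 + lam*||x||_1.
    Each coordinate is updated by an exact one-dimensional minimization,
    which for the l1 term amounts to a scalar soft-thresholding step."""
    r = b - A @ x                            # current residual
    for j in range(x.size):
        aj = A[:, j]
        denom = aj @ aj                      # ||a_j||^2
        zj = x[j] + (aj @ r) / denom         # unregularized 1-D minimizer
        new_xj = np.sign(zj) * max(abs(zj) - lam / denom, 0.0)
        r += aj * (x[j] - new_xj)            # keep the residual consistent
        x[j] = new_xj
    return x
```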