Proximal Thresholding Algorithm for Minimization over Orthonormal Bases

The notion of soft thresholding plays a central role in problems from various areas of applied mathematics, in which the ideal solution is known to possess a sparse decomposition in some orthonormal basis. Using convex-analytical tools, we extend this notion to that of proximal thresholding and investigate its properties, providing in particular several characterizations of …
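As a concrete illustration, here is a minimal sketch of the classical soft-thresholding operator that proximal thresholding generalizes: the proximity operator of $\gamma\|\cdot\|_1$, applied coefficient-wise to the coordinates of a signal in an orthonormal basis. The function and variable names are ours, not the paper's.

```python
import numpy as np

def soft_threshold(x, gamma):
    """Classical soft thresholding: the proximity operator of gamma*||.||_1.
    Shrinks each coefficient toward zero by gamma; small ones become exactly zero."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

# Applied coefficient-wise in an orthonormal basis, this promotes sparsity.
coeffs = np.array([0.3, -1.2, 0.05, 2.0])
print(soft_threshold(coeffs, 0.5))  # -> [0., -0.7, 0., 1.5]
```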

Variational Problems in Quasi-Newton Methods

It has been known since the early 1970s that the Hessian matrices in quasi-Newton methods can be updated by variational means, in several different ways. The usual formulation of these variational problems uses a coordinate system, and the symmetry of the Hessian matrices is enforced as an explicit constraint. As a result, the variational problems seem …
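For reference, the following is a minimal sketch of the standard BFGS update, one member of the family of variational Hessian updates the passage refers to; the variable names and the finite-dimensional coordinate form are ours.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    s = x_{k+1} - x_k (step), y = grad_{k+1} - grad_k (gradient change).
    The updated matrix is symmetric and satisfies the secant condition
    B_new @ s = y; classical variational characterizations single out
    such updates as nearest matrices in suitable weighted norms.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```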

Numerical Experience with a Recursive Trust-Region Method for Multilevel Nonlinear Optimization

We consider an implementation of the recursive multilevel trust-region algorithm proposed by Gratton, Sartenaer, and Toint (2004), and provide significant numerical experience on multilevel test problems. A suitable choice of the algorithm’s parameters is identified on these problems, yielding a very satisfactory compromise between reliability and efficiency. The resulting default algorithm is then compared to alternative …

An Extension of the Proximal Point Method for Quasiconvex Minimization

In this paper we propose an extension of the proximal point method to solve minimization problems with quasiconvex objective functions on the Euclidean space and the nonnegative orthant. For the unconstrained minimization problem, assuming that the function is bounded from below and lower semicontinuous, we prove that the iterations $\{x^k\}$ given by $0 \in \hat{\partial}\big(f(\cdot) + \tfrac{\lambda_k}{2}\|\cdot - x^{k-1}\|^2\big)(x^k)$ are …
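To fix ideas, here is a minimal sketch of the classical proximal point iteration in one dimension for a convex, unconstrained problem, with SciPy's scalar minimizer solving each regularized subproblem; it does not reproduce the paper's quasiconvex analysis, and all names are illustrative.

```python
from scipy.optimize import minimize_scalar

def proximal_point(f, x0, lam=1.0, iters=50):
    """Proximal point method: each step minimizes f plus a quadratic
    penalty (lam/2)*(z - x)^2 that keeps the new iterate near the old one."""
    x = x0
    for _ in range(iters):
        x = minimize_scalar(lambda z: f(z) + 0.5 * lam * (z - x) ** 2).x
    return x

# Example: the nonsmooth convex function |z - 3| has minimizer z = 3.
print(proximal_point(lambda z: abs(z - 3.0), 0.0))  # -> approximately 3.0
```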

On the divergence of line search methods

We discuss the convergence of line search methods for minimization. We explain how Newton’s method and the BFGS method can fail even if the restrictions of the objective function to the search lines are strictly convex functions, the level sets of the objective function are compact, the line searches are exact, and the Wolfe conditions …
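For readers unfamiliar with the conditions named above, here is a minimal sketch of a check of the (weak) Wolfe conditions for a step size t along a search direction d; the constants c1 and c2 are conventional illustrative choices, not values from the paper.

```python
def wolfe_conditions(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Check the two weak Wolfe conditions for step t along direction d:
    sufficient decrease (Armijo) and the curvature condition."""
    g_d = grad(x) @ d  # directional derivative at x; should be negative
    armijo = f(x + t * d) <= f(x) + c1 * t * g_d
    curvature = grad(x + t * d) @ d >= c2 * g_d
    return armijo and curvature
```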

Perturbed projections and subgradient projections for the multiple-sets split feasibility problem

We study the multiple-sets split feasibility problem, which requires finding a point closest to a family of closed convex sets in one space such that its image under a linear transformation is closest to another family of closed convex sets in the image space. By casting the problem into an equivalent problem in …
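As a point of reference, here is a minimal sketch of a CQ-type projection iteration for the single-set special case (find x in C with A @ x in Q), with the Euclidean projections onto C and Q supplied by the caller; the multiple-sets and perturbed variants studied in the paper build on the same ingredients, and all names here are illustrative.

```python
import numpy as np

def cq_iteration(A, proj_C, proj_Q, x0, iters=200):
    """Projection iteration for the split feasibility problem:
    find x in C such that A @ x lies in Q. The step size gamma is
    taken from ||A||^2, a standard choice in the convex, consistent case."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2
    x = x0
    for _ in range(iters):
        Ax = A @ x
        x = proj_C(x + gamma * (A.T @ (proj_Q(Ax) - Ax)))
    return x
```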

Coordinate and Subspace Optimization Methods for Linear Least Squares with Non-Quadratic Regularization

This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separable regularization. Although it arises frequently in applications, the RLS problem is often hard to solve using standard iterative methods. In a recent work [10], a new iterative method called Parallel Coordinate Descent (PCD) was devised. We provide herein a convergence …
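For concreteness, here is a minimal sketch of plain cyclic coordinate descent for l1-regularized least squares, a representative instance of the separable-regularization problem the abstract treats; this is not the PCD method of [10], and the names are ours.

```python
import numpy as np

def coordinate_descent_lasso(A, b, lam, iters=100):
    """Cyclic coordinate descent for min_x 0.5*||A x - b||^2 + lam*||x||_1.
    Each one-dimensional subproblem has a closed-form soft-thresholding solution."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x
    col_sq = (A ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(n):
            r += A[:, j] * x[j]   # remove coordinate j's contribution
            rho = A[:, j] @ r
            x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= A[:, j] * x[j]   # restore the updated contribution
    return x
```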

Using Partial Separability of Functions in Generating Set Search Methods for Unconstrained Optimisation

Generating Set Search (GSS) methods, a class of derivative-free methods for unconstrained optimisation, are in general robust but converge slowly. It has been shown that the performance of these methods can be enhanced by utilising accumulated information about the objective function as well as a priori knowledge such as partial separability. This paper introduces a …
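As background, here is a minimal sketch of compass search, the simplest generating set search method: poll along the coordinate directions and their negatives, accept the first improving point, and shrink the step when none improves. It does not exploit partial separability, and all names are illustrative.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Compass search: the generating set is {+e_i, -e_i}. Accept the first
    improving poll point; otherwise halve the step size."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            if f(x + step * d) < f(x):
                x = x + step * d
                improved = True
                break
        if not improved:
            step *= 0.5
    return x

print(compass_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [0.0, 0.0]))
```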

Second-order convergence properties of trust-region methods using incomplete curvature information, with an application to multigrid optimization

Convergence properties of trust-region methods for unconstrained nonconvex optimization are considered in the case where information on the objective function’s local curvature is incomplete, in the sense that it may be restricted to a fixed set of “test directions” and may not be available at every iteration. It is shown that convergence to local “weak” …
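To make the notion of incomplete curvature information concrete, here is a minimal sketch that probes the curvature d^T H(x) d along a fixed set of test directions via finite differences of the gradient, without ever forming the full Hessian; the finite-difference probe is our illustration, not the paper's construction.

```python
import numpy as np

def curvature_along_directions(grad, x, directions, h=1e-6):
    """Estimate d^T H(x) d for each test direction d using a finite
    difference of the gradient along d."""
    g0 = grad(x)
    return [(d @ (grad(x + h * d) - g0)) / h for d in directions]

# Example: f(x) = x0^2 - x1^2 has curvature +2 along e0 and -2 along e1.
grad = lambda x: np.array([2 * x[0], -2 * x[1]])
dirs = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(curvature_along_directions(grad, np.array([1.0, 1.0]), dirs))
```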

On the behavior of the conjugate-gradient method on ill-conditioned problems

We study the behavior of the conjugate-gradient method for solving a set of linear equations where the matrix is symmetric and positive definite, with one set of eigenvalues that are large and the remaining eigenvalues small. We characterize the behavior of the residuals associated with the large eigenvalues throughout the iterations, and also characterize the …
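For reference, here is a minimal sketch of the standard conjugate-gradient method for a symmetric positive definite system A x = b; it records the residual norm at each iteration, which is the quantity whose behavior on two widely separated eigenvalue clusters the paper analyzes. The names are ours.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Standard CG for symmetric positive definite A. Returns the solution
    and the residual norm history."""
    n = len(b)
    x = np.zeros(n)
    r = np.array(b, dtype=float)   # residual b - A x with x = 0
    p = r.copy()
    rs = r @ r
    res_norms = [np.sqrt(rs)]
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        res_norms.append(np.sqrt(rs_new))
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, res_norms
```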