A Quasi-Newton Algorithm for Nonconvex, Nonsmooth Optimization with Global Convergence Guarantees

A line search algorithm for minimizing nonconvex and/or nonsmooth objective functions is presented. The algorithm is a hybrid between a standard Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, and together with the adaptive GS strategy …
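A minimal sketch of the core gradient-sampling step, assuming a user-supplied gradient function; the sampling radius, sample count, and the use of an off-the-shelf QP solver are illustrative choices, not the authors' implementation:

    import numpy as np
    from scipy.optimize import minimize

    def gs_direction(grad, x, eps=1e-2, m=10, rng=None):
        # Sample gradients at x and at m points within radius eps of x, then
        # take the minimum-norm element of their convex hull; the negated
        # result serves as a descent direction for nonsmooth f.
        if rng is None:
            rng = np.random.default_rng()
        pts = [x] + [x + eps * rng.standard_normal(x.size) for _ in range(m)]
        G = np.array([grad(p) for p in pts])            # one gradient per row
        obj = lambda w: np.sum((G.T @ w) ** 2)          # ||sum_i w_i g_i||^2
        w0 = np.full(len(pts), 1.0 / len(pts))
        res = minimize(obj, w0, bounds=[(0.0, 1.0)] * len(pts),
                       constraints={'type': 'eq', 'fun': lambda w: w.sum() - 1.0})
        return -(G.T @ res.x)                           # search direction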

A Second-Order Method for Compressed Sensing Problems with Coherent and Redundant Dictionaries

In this paper we are interested in the solution of Compressed Sensing (CS) problems where the signals to be recovered are sparse in coherent and redundant dictionaries. CS problems of this type are convex, with a non-smooth and non-separable regularization term; a specialized solver is therefore required. We propose a primal-dual Newton Conjugate Gradients (pdNCG) method. …
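To make the setting concrete, here is a small synthetic sketch of the smoothing idea behind such solvers: the non-smooth l1-analysis term is replaced by a pseudo-Huber approximation and minimized with an off-the-shelf Newton-CG routine. The data, the parameter values, and the use of scipy in place of the authors' primal-dual iteration are all assumptions for illustration:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))    # made-up measurement matrix
    W = rng.standard_normal((100, 150))   # made-up redundant analysis dictionary
    b = A @ rng.standard_normal(100)
    tau, mu = 1e-1, 1e-3                  # regularization and smoothing parameters

    def fun(x):
        r = A @ x - b
        z = W.T @ x
        # pseudo-Huber smoothing of ||W^T x||_1
        return 0.5 * r @ r + tau * np.sum(np.sqrt(z**2 + mu**2) - mu)

    def grad(x):
        z = W.T @ x
        return A.T @ (A @ x - b) + tau * (W @ (z / np.sqrt(z**2 + mu**2)))

    x = minimize(fun, np.zeros(100), jac=grad, method='Newton-CG',
                 options={'xtol': 1e-8}).x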

Generalized Inexact Proximal Algorithms: Habit’s/Routine’s Formation with Resistance to Change, following Worthwhile Changes

This paper shows how, in a quasi-metric space, an inexact proximal algorithm with a generalized perturbation term provides a useful tool for the Behavioral Sciences (Psychology, Economics, Management, Game Theory, …). More precisely, the new perturbation term represents an index of resistance to change, defined as a “curved enough” function of the quasi distance …
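A toy one-dimensional sketch of an exact proximal step with an asymmetric quasi-distance standing in for resistance to change; the cost asymmetry, the quadratic perturbation, and all numbers are made up for illustration:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def q(x, u):
        # Asymmetric quasi distance: moving "up" costs twice as much as
        # moving "down", so q(x, u) != q(u, x) in general.
        return 2.0 * (u - x) if u >= x else (x - u)

    def prox_step(f, x, lam=1.0):
        # x_{k+1} in argmin_u f(u) + lam * q(x_k, u)^2
        return minimize_scalar(lambda u: f(u) + lam * q(x, u)**2).x

    f = lambda u: (u - 3.0)**2
    x = 0.0
    for _ in range(20):
        x = prox_step(f, x)
    print(x)   # approaches the minimizer 3.0, slowed by the resistance term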

On solving symmetric systems of linear equations in an unnormalized Krylov subspace framework

In an unnormalized Krylov subspace framework for solving symmetric systems of linear equations, the orthogonal vectors that are generated by a Lanczos process are not necessarily in the form of gradients. Associating each orthogonal vector with a triple, and using only the three-term recurrences of the triples, we give conditions on whether a symmetric system …
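For reference, the familiar normalized three-term Lanczos recurrence that generates such orthogonal vectors for a symmetric matrix; the unnormalized, triple-based framework of the paper itself is not reproduced here:

    import numpy as np

    def lanczos(A, b, k):
        # Three-term recurrence generating orthonormal q_1..q_k spanning the
        # Krylov subspace K_k(A, b), valid for symmetric A.
        Q = np.zeros((b.size, k))
        q, q_prev, beta = b / np.linalg.norm(b), np.zeros(b.size), 0.0
        alphas, betas = [], []
        for j in range(k):
            Q[:, j] = q
            w = A @ q - beta * q_prev
            alpha = q @ w
            w -= alpha * q
            beta = np.linalg.norm(w)
            alphas.append(alpha); betas.append(beta)
            if beta == 0.0:          # invariant subspace found: stop early
                return Q[:, :j + 1], np.array(alphas), np.array(betas)
            q_prev, q = q, w / beta
        return Q, np.array(alphas), np.array(betas)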

A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea

A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed, which consists of corrections (derived from the idea of conjugate directions) to the difference vectors used, for better satisfaction of previous quasi-Newton conditions. In comparison with [16], where a similar approach is used, correction vectors from more previous iterations …
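For context, the standard limited-memory two-loop recursion that such difference pairs feed into; the BNS-specific details and the correction of the stored vectors themselves are left to the paper:

    import numpy as np

    def two_loop(g, S, Y):
        # Standard L-BFGS two-loop recursion: applies the inverse Hessian
        # approximation built from difference pairs (s_i, y_i), stored oldest
        # first in the lists S and Y, to the gradient g.
        rho = [1.0 / (y @ s) for s, y in zip(S, Y)]
        q, alpha = g.copy(), []
        for s, y, r in zip(reversed(S), reversed(Y), reversed(rho)):
            a = r * (s @ q)
            alpha.append(a)
            q = q - a * y
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])      # standard initial scaling
        for (s, y, r), a in zip(zip(S, Y, rho), reversed(alpha)):
            q = q + (a - r * (y @ q)) * s
        return q                                    # approximately H_k @ g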

A globally convergent trust-region algorithm for unconstrained derivative-free optimization

In this work we present an explicit derivative-free trust-region algorithm for unconstrained optimization based on the algorithm proposed by Powell (Computational Optimization and Applications 53: 527–555, 2012). The objective function is approximated by quadratic models obtained by polynomial interpolation. The number of points in the interpolation set is fixed. In each iteration only one interpolation point …
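A minimal sketch of the model-building ingredient: a fully determined quadratic interpolation model in two variables, fitted by solving the 6x6 interpolation system. The sample points and test function are made up, and the trust-region management of the actual algorithm is omitted:

    import numpy as np

    def quad_model(points, fvals):
        # Build a fully determined quadratic model m(p) = c + g.p + 0.5 p'Hp
        # in two variables from 6 interpolation points (a poised set).
        rows = [[1.0, x, y, 0.5 * x * x, x * y, 0.5 * y * y] for x, y in points]
        c0 = np.linalg.solve(np.array(rows), np.array(fvals, dtype=float))
        c, g = c0[0], c0[1:3]
        H = np.array([[c0[3], c0[4]], [c0[4], c0[5]]])
        return c, g, H

    f = lambda p: (p[0] - 1.0) ** 2 + 2.0 * (p[1] + 0.5) ** 2   # test function
    pts = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1), (1, 1)]
    c, g, H = quad_model(pts, [f(p) for p in pts])
    print(np.linalg.solve(H, -g))   # model minimizer: (1, -0.5), exact here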

A Stochastic Quasi-Newton Method for Large-Scale Optimization

The question of how to incorporate curvature information in stochastic approximation methods is challenging. The direct application of classical quasi-Newton updating techniques for deterministic optimization leads to noisy curvature estimates that have harmful effects on the robustness of the iteration. In this paper, we propose a stochastic quasi-Newton method that is efficient, robust …
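A sketch of the decoupling idea such methods use, assuming access to a subsampled Hessian-vector product: curvature pairs are formed from averaged iterates rather than from noisy gradient differences. The names and the safeguard threshold below are illustrative:

    import numpy as np

    def curvature_pair(hess_vec, x_avg_new, x_avg_old, eps=1e-10):
        # s comes from averaged iterates; y from a Hessian-vector product on a
        # larger, independent sample, giving a less noisy curvature estimate
        # than differencing stochastic gradients.
        s = x_avg_new - x_avg_old
        y = hess_vec(x_avg_new, s)      # approx. (sum_i H_i(x) @ s) / |S_H|
        if s @ y > eps * (s @ s):       # keep only safely positive curvature
            return s, y
        return None                     # skip the quasi-Newton update otherwise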

On Efficiently Combining Limited Memory and Trust-Region Techniques

Limited memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of the two deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We show how to efficiently …
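To see what a straightforward combination looks like, here is a generic dogleg step using operator access to a quasi-Newton matrix B and its inverse H; this is the kind of naive scheme the paper improves upon, and all names are illustrative:

    import numpy as np

    def dogleg(g, B_dot, H_dot, delta):
        # Naive trust-region step from a quasi-Newton model: take the full
        # step -H @ g if it fits, otherwise blend with the Cauchy point.
        p_b = -H_dot(g)                              # full quasi-Newton step
        if np.linalg.norm(p_b) <= delta:
            return p_b
        p_u = -((g @ g) / (g @ B_dot(g))) * g        # Cauchy point along -g
        if np.linalg.norm(p_u) >= delta:
            return -(delta / np.linalg.norm(g)) * g  # clipped gradient step
        d = p_b - p_u                                # walk toward p_b until
        a, b, c = d @ d, 2.0 * (p_u @ d), p_u @ p_u - delta ** 2
        t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
        return p_u + t * d                           # boundary intersection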

An efficient gradient method using the Yuan steplength

We propose a new gradient method for quadratic programming, named SDC, which alternates some steepest descent (SD) iterates with some gradient iterates that use a constant steplength computed through the Yuan formula. The SDC method exploits the asymptotic spectral behaviour of the Yuan steplength to foster a selective elimination of the components of the gradient along the …
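A sketch of the alternation pattern on a strictly convex quadratic, with the Yuan steplength written out as recalled (notation adapted); the sweep lengths h and s and the test data are arbitrary choices:

    import numpy as np

    def sdc(A, b, x, h=4, s=3, cycles=10):
        # SDC pattern for f(x) = 0.5 x'Ax - b'x: h exact steepest-descent
        # (Cauchy) steps, then s gradient steps with a frozen Yuan steplength.
        for _ in range(cycles):
            prev = None                      # (last SD steplength, ||g||)
            for _ in range(h):               # Cauchy sweep
                g = A @ x - b
                a = (g @ g) / (g @ (A @ g))
                if prev is not None:         # Yuan steplength from two SD steps
                    a1, n1 = prev
                    r = np.sqrt((1/a1 - 1/a)**2 + 4*(g @ g)/(a1 * n1)**2)
                    a_yuan = 2.0 / (1/a1 + 1/a + r)
                prev = (a, np.sqrt(g @ g))
                x = x - a * g
            for _ in range(s):               # constant-steplength sweep
                g = A @ x - b
                x = x - a_yuan * g
        return x

    A = np.diag([1.0, 10.0, 100.0]); b = np.ones(3)
    print(sdc(A, b, np.zeros(3)))   # approaches A^{-1} b = (1, 0.1, 0.01)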

Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.

Citation: Report naXys-09-2013, Namur Centre for Complex Systems, University of Namur, Namur (Belgium)
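As background, the unweighted least-change update that satisfies several secant equations simultaneously; the paper's weighted, variational version modifies this starting point, which is what leads to a Sylvester equation for the correction matrix:

    import numpy as np

    def multisecant_update(B, S, Y):
        # Least-change (Frobenius-norm) correction of B satisfying all secant
        # equations B_new @ S = Y at once; columns of S are steps s_i, columns
        # of Y the matching gradient differences y_i (S of full column rank).
        # Unweighted and unsymmetric, unlike the paper's variational update.
        return B + (Y - B @ S) @ np.linalg.solve(S.T @ S, S.T)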