A Limited Memory Subspace Minimization Conjugate Gradient Algorithm for Unconstrained Optimization

Subspace minimization conjugate gradient (SMCG) methods are a class of iterative methods with high potential for unconstrained optimization. Orthogonality is an important property of the linear conjugate gradient method. It is observed, however, that the orthogonality of the gradients in the linear conjugate gradient method is often lost, which usually causes slow convergence of the conjugate gradient method. … Read more
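
The loss of orthogonality is easy to observe numerically. As a hedged illustration (not the method proposed in the paper), the following Python sketch runs linear CG with the Fletcher–Reeves update on an ill-conditioned quadratic and tracks |g_{k+1}^T g_k| / ||g_{k+1}||^2, the quantity that Powell-style restart tests monitor; the test matrix, tolerance, and dimensions are arbitrary demonstration choices.

```python
# Illustrative sketch: loss of gradient orthogonality in linear CG.
# The matrix, tolerance, and sizes are demo choices, not from the paper.
import numpy as np

def cg_orthogonality_demo(A, b, tol=1e-8, max_iter=200):
    x = np.zeros_like(b)
    g = A @ x - b                  # gradient of 0.5*x^T A x - b^T x
    d = -g
    drift = 0.0
    for k in range(max_iter):
        Ad = A @ d
        alpha = (g @ g) / (d @ Ad)          # exact line search on a quadratic
        x = x + alpha * d
        g_new = g + alpha * Ad
        # In exact arithmetic g_new is orthogonal to g; in floating point
        # this ratio drifts away from zero as orthogonality is lost.
        drift = abs(g_new @ g) / (g_new @ g_new + 1e-300)
        if np.linalg.norm(g_new) < tol:
            return x, k + 1, drift
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x, max_iter, drift

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((100, 100)))
A = Q @ np.diag(np.logspace(0, 6, 100)) @ Q.T   # condition number ~1e6
b = rng.standard_normal(100)
x, iters, drift = cg_orthogonality_demo(A, b)
print(f"iterations: {iters}, final orthogonality drift: {drift:.2e}")
```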

Two efficient gradient methods with approximately optimal stepsizes based on regularization models for unconstrained optimization

It is widely accepted that the stepsize is of great significance to gradient methods. Two efficient gradient methods with approximately optimal stepsizes, based mainly on regularization models, are proposed for unconstrained optimization. More precisely, if the objective function is not close to a quadratic on the line segment between the current iterate and the previous one, … Read more
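
A minimal sketch of the idea, under assumptions of mine rather than the paper's exact construction (whose details sit behind the cutoff): estimate the curvature along the negative gradient with the Barzilai–Borwein scalar s^T y / s^T s, trust the quadratic model when it predicts the new function value well, and otherwise take the stepsize that minimizes a cubic-regularized one-dimensional model. The switching test, the regularization parameter sigma, and the test function are all illustrative.

```python
# Hedged sketch of a gradient method with an "approximately optimal"
# stepsize. Not the paper's algorithm: the fit test and sigma are
# placeholder choices for illustration only.
import numpy as np

def approx_optimal_gradient(f, grad, x0, sigma=1.0, tol=1e-6, max_iter=500):
    x = x0.copy()
    g = grad(x)
    alpha = 1.0 / np.linalg.norm(g)           # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        c = max((s @ y) / (s @ s), 1e-12)      # BB curvature estimate
        gn = max(np.linalg.norm(g_new), 1e-12)
        # Quadratic model along -g: m(a) = f - a*||g||^2 + (c/2)*a^2*||g||^2
        pred = f(x) - alpha * (g @ g) + 0.5 * c * alpha**2 * (g @ g)
        if abs(pred - f(x_new)) <= 1e-3 * max(1.0, abs(f(x_new))):
            alpha = 1.0 / c                    # minimizer of quadratic model
        else:
            # Minimizer of the cubic-regularized model
            # m(a) = f - a*||g||^2 + (c/2)*a^2*||g||^2 + (sigma/6)*a^3*||g||^3,
            # i.e. the positive root of (sigma*||g||/2)*a^2 + c*a - 1 = 0.
            alpha = (-c + np.sqrt(c * c + 2.0 * sigma * gn)) / (sigma * gn)
        x, g = x_new, g_new
    return x

# usage on a convex quartic test function
f = lambda x: 0.25 * np.sum(x**4) + 0.5 * np.sum(x**2)
grad = lambda x: x**3 + x
print(approx_optimal_gradient(f, grad, np.ones(5)))
```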

On the Convergence Results of a Class of Nonmonotone Accelerated Proximal Gradient Methods for Nonsmooth and Nonconvex Minimization Problems

In this paper, we consider a class of nonsmooth problems in which the objective is the sum of a Lipschitz differentiable function and a nonsmooth, proper, lower semicontinuous function. We discuss the convergence rate of the function values for a nonmonotone accelerated proximal gradient method, which was proposed in “Huan Li and Zhouchen Lin: Accelerated proximal gradient … Read more
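
For orientation, the problem class and the proximal mapping underlying such methods can be written in standard notation (this is the usual setting of this line of work, not text recovered from the truncated abstract):

$$
\min_{x \in \mathbb{R}^n} F(x) := f(x) + g(x), \qquad \nabla f \ L\text{-Lipschitz continuous}, \quad g \ \text{proper and lower semicontinuous},
$$

$$
\operatorname{prox}_{\alpha g}(x) := \operatorname*{arg\,min}_{u \in \mathbb{R}^n} \Big\{ g(u) + \frac{1}{2\alpha} \|u - x\|^2 \Big\}, \qquad \alpha > 0.
$$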

A Nonmonotone Accelerated Proximal Gradient Method with Variable Stepsize Strategy for Nonsmooth and Nonconvex Minimization Problems

We propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. In this algorithm, the objective function value is allowed to increase at some iterations, but it decreases from an overall point of view. The variable stepsize strategy does not … Read more
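
To make the "increase occasionally, decrease overall" behavior concrete, here is a hedged Python sketch of a nonmonotone proximal gradient iteration: a trial point is accepted if it improves on the maximum of the last M objective values, so individual iterations may go uphill while the envelope decreases. The l1 regularizer, window length M, sufficient-decrease constant, and backtracking factor are illustrative choices, and the extrapolation (acceleration) step is omitted for brevity; none of this is claimed to be the paper's exact strategy.

```python
# Hedged sketch: nonmonotone (Grippo-style) proximal gradient method.
# All constants below are demonstration choices, not the paper's.
import numpy as np
from collections import deque

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def nonmonotone_prox_grad(f, grad, lam, x0, alpha0=1.0, M=5,
                          eta=0.5, tol=1e-8, max_iter=1000):
    x = x0.copy()
    F = lambda z: f(z) + lam * np.abs(z).sum()
    history = deque([F(x)], maxlen=M)         # last M objective values
    for _ in range(max_iter):
        g = grad(x)
        alpha = alpha0
        while True:
            x_trial = soft_threshold(x - alpha * g, alpha * lam)
            decrease = (1e-4 / (2 * alpha)) * np.sum((x_trial - x) ** 2)
            # accept if below the worst recent value, minus a decrease term
            if F(x_trial) <= max(history) - decrease:
                break
            alpha *= eta                      # backtrack
        if np.linalg.norm(x_trial - x) < tol:
            return x_trial
        x = x_trial
        history.append(F(x))
    return x

# usage: LASSO-type toy problem min 0.5*||Ax-b||^2 + lam*||x||_1
rng = np.random.default_rng(1)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
print(nonmonotone_prox_grad(f, grad, lam=0.1, x0=np.zeros(10)))
```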

Some Modified Fast Iterative Shrinkage-Thresholding Algorithms with a New Adaptive Non-monotone Stepsize Strategy for Nonsmooth and Convex Minimization Problems

The “fast iterative shrinkage-thresholding algorithm” (FISTA) is one of the most famous first-order optimization schemes, and its stepsize, which plays an important role in both theoretical analysis and numerical experiments, is typically determined either by a constant related to the Lipschitz constant or by a backtracking strategy. In this paper, we design a new … Read more
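
For contrast with the adaptive non-monotone strategy the paper develops (which sits behind the cutoff), here is the classical FISTA baseline with the constant stepsize 1/L on a LASSO-type problem; the problem data are arbitrary.

```python
# Classical FISTA with constant stepsize 1/L (baseline, not the paper's
# new stepsize strategy). Problem: min 0.5*||Ax-b||^2 + lam*||x||_1.
import numpy as np

def fista(A, b, lam, max_iter=500):
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(max_iter):
        g = A.T @ (A @ y - b)                  # gradient of smooth part at y
        z = y - g / L                          # forward (gradient) step
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # l1 prox
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(2)
A, b = rng.standard_normal((40, 15)), rng.standard_normal(40)
print(fista(A, b, lam=0.1))
```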

New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization

In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, in which a special scaled norm in the p-regularization model is analyzed. Different choices of the special scaled norm lead to different solutions of the p-regularized subproblem. Based on analyses of the solutions in a two-dimensional subspace, we derive new search directions … Read more
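
The generic shape of such a subproblem can be stated in standard notation (the symbols B_k, σ_k, and M_k below are the usual Hessian approximation, regularization parameter, and scaling matrix; the paper's specific scaled norm sits behind the cutoff):

$$
\min_{d \in \operatorname{span}\{g_k,\, s_{k-1}\}} \; g_k^{\top} d + \frac{1}{2}\, d^{\top} B_k d + \frac{\sigma_k}{p}\, \|d\|_{M_k}^{p}, \qquad p > 2,
$$

where $g_k$ is the current gradient and $s_{k-1} = x_k - x_{k-1}$ is the latest step.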

Quadratic regularization projected alternating Barzilai–Borwein method for constrained optimization

In this paper, based on regularization techniques and projected gradient strategies, we present a quadratic regularization projected alternating Barzilai–Borwein (QRPABB) method for minimizing differentiable functions on closed convex sets. We show convergence of the QRPABB method to a constrained stationary point under a nonmonotone line search. When the objective function is convex, we … Read more
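
As a rough illustration of the projected alternating Barzilai–Borwein idea (with the paper's quadratic regularization safeguard and nonmonotone line search simplified away), the Python sketch below alternates the two BB stepsizes and projects each trial point back onto a box; the quadratic test problem and box bounds are arbitrary.

```python
# Hedged sketch: projected gradient iteration with alternating BB steps.
# The regularization safeguard and line search of QRPABB are omitted.
import numpy as np

def projected_alternating_bb(grad, proj, x0, max_iter=500, tol=1e-8):
    x = proj(x0)
    g = grad(x)
    alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # conservative first step
    for k in range(max_iter):
        x_new = proj(x - alpha * g)            # projected gradient step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:
            # alternate between the two Barzilai-Borwein formulas
            alpha = (s @ s) / sy if k % 2 == 0 else sy / (y @ y)
        x, g = x_new, g_new
    return x

# usage: min 0.5*x^T A x - b^T x subject to 0 <= x <= 1
rng = np.random.default_rng(3)
M = rng.standard_normal((20, 20))
A = M @ M.T + np.eye(20)                       # symmetric positive definite
b = rng.standard_normal(20)
grad = lambda x: A @ x - b
proj = lambda x: np.clip(x, 0.0, 1.0)          # projection onto the box
print(projected_alternating_bb(grad, proj, x0=np.zeros(20)))
```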