Subsampled cubic regularization method for finite-sum minimization

This paper proposes and analyzes a subsampled Cubic Regularization Method (CRM) for solving finite-sum optimization problems. The new method uses random subsampling techniques to approximate the functions, gradients and Hessians in order to reduce the overall computational cost of the CRM. Under suitable hypotheses, first- and second-order iteration-complexity bounds and global convergence analyses are presented. …
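
As a rough illustration, here is a minimal numpy sketch of a subsampled cubic-regularization loop. The per-summand oracles fi_grad/fi_hess, the fixed batch size, and the crude inner solver for the cubic model are all hypothetical choices for illustration, not the paper's method.

    import numpy as np

    def subsampled_crm(fi_grad, fi_hess, n, x0, sigma=1.0, batch=32, iters=50, seed=0):
        """Sketch of a subsampled cubic-regularization iteration for
        f(x) = (1/n) * sum_i f_i(x); fi_grad(i, x) and fi_hess(i, x)
        are hypothetical per-summand gradient/Hessian oracles."""
        rng = np.random.default_rng(seed)
        x = x0.astype(float)
        for _ in range(iters):
            S = rng.choice(n, size=min(batch, n), replace=False)
            g = np.mean([fi_grad(i, x) for i in S], axis=0)  # subsampled gradient
            H = np.mean([fi_hess(i, x) for i in S], axis=0)  # subsampled Hessian
            # Approximately minimize the cubic model
            #   m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
            # by a few gradient steps on m (a crude inner solver).
            s = np.zeros_like(x)
            for _ in range(20):
                grad_m = g + H @ s + sigma * np.linalg.norm(s) * s
                s -= 0.1 * grad_m
            x = x + s
        return x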

A Cubic Regularization of Newton’s Method with Finite-Difference Hessian Approximations

In this paper, we present a version of the Cubic Regularization of Newton's method for unconstrained nonconvex optimization, in which the Hessian matrices are approximated by forward finite-difference Hessians. The regularization parameter of the cubic models and the accuracy of the Hessian approximations are jointly adjusted using a nonmonotone line-search criterion. Assuming that the …
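
The forward finite-difference Hessian approximation mentioned above can be sketched as follows; the step size h is fixed here for simplicity, whereas the paper adjusts the approximation accuracy jointly with the regularization parameter.

    import numpy as np

    def fd_hessian(grad, x, h=1e-5):
        """Forward finite-difference Hessian approximation: column j is
        (grad(x + h*e_j) - grad(x)) / h, symmetrized afterwards."""
        n = x.size
        g0 = grad(x)
        H = np.empty((n, n))
        for j in range(n):
            e = np.zeros(n)
            e[j] = h
            H[:, j] = (grad(x + e) - g0) / h
        return 0.5 * (H + H.T)  # symmetrize the approximation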

A framework for convex-constrained monotone nonlinear equations and its special cases

This work concerns methods for solving convex-constrained monotone nonlinear equations. We first propose a framework, obtained by combining a safeguard strategy on the search directions with a notion of approximate projections. The global convergence of the framework is established under appropriate assumptions, and some examples of methods which fall into this framework …
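
A minimal sketch of one hyperplane-projection iteration of the kind such frameworks cover, assuming a monotone map F and an (approximate) projection proj_C onto the feasible set; the residual direction and the line-search constants are illustrative choices, not the paper's safeguard strategy.

    import numpy as np

    def projected_residual_method(F, proj_C, x0, iters=200, beta=0.5, sig=1e-4):
        """Sketch of a projection scheme for monotone F(x) = 0 over a
        convex set C, with a backtracking search and a hyperplane step."""
        x = x0.astype(float)
        for _ in range(iters):
            d = -F(x)  # residual direction (a simple safeguarded choice)
            # backtracking: find z = x + t*d with -F(z)^T d >= sig * t * ||d||^2
            t = 1.0
            while np.dot(F(x + t * d), d) > -sig * t * np.dot(d, d):
                t *= beta
                if t < 1e-12:
                    break
            z = x + t * d
            Fz = F(z)
            if np.dot(Fz, Fz) == 0:
                return z  # z already solves the equation
            lam = np.dot(Fz, x - z) / np.dot(Fz, Fz)  # hyperplane step length
            x = proj_C(x - lam * Fz)
            if np.linalg.norm(F(x)) < 1e-8:
                break
        return x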

A study of Liu-Storey conjugate gradient methods for vector optimization

This work presents a study of Liu-Storey (LS) nonlinear conjugate gradient (CG) methods to solve vector optimization problems. Three variants of the LS-CG method, originally designed to solve single-objective problems, are extended to the vector setting. The first algorithm restricts the LS conjugate parameter to be nonnegative and uses a sufficiently accurate line search satisfying …
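
In the single-objective case, the first variant can be sketched as below; a simple Armijo backtracking stands in for the sufficiently accurate line search analyzed in the paper, and the vector-valued extension is not reproduced here.

    import numpy as np

    def ls_cg(f, grad, x0, iters=100):
        """Scalar-case sketch of the Liu-Storey CG iteration with the
        conjugate parameter clipped at zero: beta_k = max(beta_LS, 0)."""
        x = x0.astype(float)
        g = grad(x)
        d = -g
        for _ in range(iters):
            t, fx = 1.0, f(x)  # Armijo backtracking along d
            while f(x + t * d) > fx + 1e-4 * t * np.dot(g, d):
                t *= 0.5
            x_new = x + t * d
            g_new = grad(x_new)
            # Liu-Storey parameter, restricted to be nonnegative
            beta = max(np.dot(g_new, g_new - g) / (-np.dot(d, g)), 0.0)
            d = -g_new + beta * d
            x, g = x_new, g_new
            if np.linalg.norm(g) < 1e-8:
                break
        return x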

Inexact Variable Metric Method for Convex-Constrained Optimization Problems

This paper is concerned with the inexact variable metric method for solving convex-constrained optimization problems. At each iteration of this method, the search direction is obtained by inexactly minimizing a strictly convex quadratic function over the closed convex feasible set. Here, we propose a new inexactness criterion for the search direction subproblems. Under mild assumptions, …
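
A sketch of one such inexact search-direction computation, using projected gradient steps on the quadratic subproblem; the relative stopping test tol is a hypothetical stand-in for the inexactness criterion proposed in the paper.

    import numpy as np

    def variable_metric_step(grad_fx, B, x, proj_C, tol=1e-2, inner=50):
        """Approximately minimize q(d) = grad_fx^T d + 0.5 d^T B d over
        {d : x + d in C} by projected gradient steps on the subproblem;
        B is assumed symmetric positive definite and proj_C a projection."""
        L = np.linalg.norm(B, 2)  # Lipschitz constant of the model gradient
        d = np.zeros_like(x)
        for _ in range(inner):
            gq = grad_fx + B @ d  # gradient of the quadratic model
            d_new = proj_C(x + d - gq / L) - x
            if np.linalg.norm(d_new - d) <= tol * np.linalg.norm(d_new):
                d = d_new
                break
            d = d_new
        return d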

Globally convergent Newton-type methods for multiobjective optimization

We propose two Newton-type methods for solving (possibly) nonconvex unconstrained multiobjective optimization problems. The first is directly inspired by the Newton method designed to solve convex problems, whereas the second uses second-order information of the objective functions with ingredients of the steepest descent method. One of the key points of our approaches is to impose …
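
For intuition, the classical Newton direction for multiobjective problems minimizes the worst quadratic model over the objectives; a sketch of that subproblem via an auxiliary variable and SLSQP follows (the paper's safeguards for nonconvex Hessians are omitted, and this is not the paper's method itself).

    import numpy as np
    from scipy.optimize import minimize

    def multiobjective_newton_direction(grads, hessians):
        """Solve  min_d max_j grad_j^T d + 0.5 d^T H_j d  by the standard
        epigraph reformulation: minimize t subject to each quadratic
        model being at most t; z = (d, t)."""
        n = grads[0].size
        cons = [{'type': 'ineq',
                 'fun': (lambda z, g=g, H=H:
                         z[-1] - (g @ z[:-1] + 0.5 * z[:-1] @ H @ z[:-1]))}
                for g, H in zip(grads, hessians)]
        res = minimize(lambda z: z[-1], np.zeros(n + 1),
                       constraints=cons, method='SLSQP')
        return res.x[:-1]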

An inexact version of the symmetric proximal ADMM for solving separable convex optimization

In this paper, we propose and analyze an inexact version of the symmetric proximal alternating direction method of multipliers (ADMM) for solving linearly constrained optimization problems. Basically, the method allows its first subproblem to be solved inexactly in such a way that a relative approximation criterion is satisfied. In terms of the iteration number $k$, we …
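
A sketch of one symmetric ADMM iteration with its two dual updates for min f(x) + g(y) s.t. Ax + By = b; the subproblem solvers and the sign/step-size conventions are illustrative assumptions, and in the paper the first subproblem may be solved only inexactly under a relative criterion.

    import numpy as np

    def symmetric_admm(solve_x, solve_y, A, B, b, x0, y0, rho=1.0,
                       tau=0.9, theta=0.9, iters=100):
        """Sketch of the symmetric proximal ADMM loop; solve_x and solve_y
        are hypothetical subproblem solvers returning the (possibly
        inexact) minimizers of the augmented Lagrangian in x and y."""
        x, y = x0.copy(), y0.copy()
        lam = np.zeros(b.size)
        for _ in range(iters):
            # x-step (possibly inexact in the paper):
            #   argmin_x f(x) + (rho/2) ||Ax + By - b - lam/rho||^2
            x = solve_x(y, lam, rho)
            lam = lam - tau * rho * (A @ x + B @ y - b)    # first dual update
            # y-step: argmin_y g(y) + (rho/2) ||Ax + By - b - lam/rho||^2
            y = solve_y(x, lam, rho)
            lam = lam - theta * rho * (A @ x + B @ y - b)  # second dual update
        return x, y, lam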

On Inexact Accelerated Proximal Gradient Methods with Relative Error Rules

One of the most popular and important first-order methods, an accelerated variant of the classical proximal gradient method (PGM) with optimal iteration complexity, is the “Fast Iterative Shrinkage/Thresholding Algorithm” (FISTA). In this paper, two inexact versions of FISTA for minimizing the sum of two convex functions are studied. The proposed schemes inexactly solve their subproblems by using relative …
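
For reference, the exact FISTA iteration for min f(x) + g(x), with f smooth, reads as below; in the inexact versions studied here the proximal step is computed only approximately, subject to relative error rules (the prox_g interface is a hypothetical stand-in).

    import numpy as np

    def fista(grad_f, prox_g, L, x0, iters=200):
        """FISTA sketch: grad_f is the gradient of the smooth part, with
        Lipschitz constant L; prox_g(v, 1/L) returns the (possibly
        approximate) proximal point of g at v with step 1/L."""
        x = y = x0.astype(float)
        t = 1.0
        for _ in range(iters):
            x_new = prox_g(y - grad_f(y) / L, 1.0 / L)     # proximal step
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation
            x, t = x_new, t_new
        return x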

On the extension of the Hager-Zhang conjugate gradient method for vector optimization

The extension of the Hager-Zhang (HZ) nonlinear conjugate gradient method to vector optimization is discussed in the present research. In the scalar minimization case, this method generates descent directions whenever, for example, the line search satisfies the standard Wolfe conditions. We first show that, in general, the direct extension of the HZ method for vector …
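
In the scalar case, the HZ direction with its usual truncation can be sketched as follows; a simple Armijo backtracking replaces the Wolfe search assumed in the analysis, and the vector-valued extension discussed in the paper is not reproduced.

    import numpy as np

    def hz_cg(f, grad, x0, iters=100, eta=0.01):
        """Scalar-case sketch of the Hager-Zhang CG iteration with the
        standard truncation beta = max(beta_HZ, eta_k)."""
        x = x0.astype(float)
        g = grad(x)
        d = -g
        for _ in range(iters):
            t, fx = 1.0, f(x)  # Armijo backtracking along d
            while f(x + t * d) > fx + 1e-4 * t * np.dot(g, d):
                t *= 0.5
            x = x + t * d
            g_new = grad(x)
            y = g_new - g
            dy = np.dot(d, y)
            beta_hz = np.dot(y - 2.0 * d * np.dot(y, y) / dy, g_new) / dy
            eta_k = -1.0 / (np.linalg.norm(d) * min(eta, np.linalg.norm(g)))
            d = -g_new + max(beta_hz, eta_k) * d
            g = g_new
            if np.linalg.norm(g) < 1e-8:
                break
        return x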

An Inexact Newton-like conditional gradient method for constrained nonlinear systems

In this paper, we propose an inexact Newton-like conditional gradient method for solving constrained systems of nonlinear equations. The local convergence of the new method as well as results on its rate are established by using a general majorant condition. Two applications of such condition are provided: one is for functions whose derivative satisfies …
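
A sketch of the overall idea, assuming a linear-minimization oracle lmo_C over the constraint set (a hypothetical interface): each linearized Newton subproblem is solved approximately by standard Frank-Wolfe (conditional gradient) steps.

    import numpy as np

    def newton_conditional_gradient(F, J, lmo_C, x0, outer=20, inner=50):
        """Sketch: each subproblem min_z 0.5 ||F(x) + J(x)(z - x)||^2 over C
        is solved approximately by Frank-Wolfe, where lmo_C(g) returns
        argmin_{v in C} g^T v."""
        x = x0.astype(float)
        for _ in range(outer):
            Fx, Jx = F(x), J(x)
            z = x.copy()
            for k in range(inner):
                r = Fx + Jx @ (z - x)                 # linearized residual
                g = Jx.T @ r                          # subproblem gradient
                v = lmo_C(g)                          # Frank-Wolfe vertex
                z = z + (2.0 / (k + 2.0)) * (v - z)   # standard FW step size
            x = z
            if np.linalg.norm(F(x)) < 1e-8:
                break
        return x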