A Generalized Worst-Case Complexity Analysis for Non-Monotone Line Searches

We study the worst-case complexity of a non-monotone line search framework that covers a wide variety of techniques from the literature. In this framework, the non-monotonicity is controlled by a sequence of nonnegative parameters. We obtain complexity bounds to achieve approximate first-order optimality even when this sequence is not summable. …
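To make the role of the parameter sequence concrete, here is a minimal Python sketch of one common instance of such a framework: a backtracking Armijo search whose acceptance test is relaxed by a nonnegative parameter eta_k. The function names and the specific relaxed condition are illustrative assumptions, not the paper's exact framework.

```python
import numpy as np

def nonmonotone_armijo(f, grad, x, d, eta_k, c=1e-4, beta=0.5, t0=1.0):
    """Backtracking Armijo search relaxed by a parameter eta_k >= 0.

    Accepts the first t with
        f(x + t*d) <= f(x) + eta_k + c * t * grad(x).dot(d);
    eta_k = 0 recovers the classical monotone Armijo condition.
    Assumes d is a descent direction (grad(x).dot(d) < 0).
    """
    fx, slope = f(x), grad(x).dot(d)
    t = t0
    while f(x + t * d) > fx + eta_k + c * t * slope:
        t *= beta  # shrink the trial stepsize
    return t
```

A non-summable choice such as eta_k = 1/(k+1) keeps the test permanently relaxed, which is precisely the regime the complexity bounds above still cover.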

On the asymptotic convergence and acceleration of gradient methods

We consider the asymptotic behavior of a family of gradient methods, which includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family asymptotically zigzags between two directions. Asymptotic convergence results for the objective value, gradient norm, and stepsize are presented as well. To accelerate …
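As an illustration (not code from the paper), the two named family members differ only in their stepsize rule; a sketch for the strictly convex quadratic f(x) = 0.5 x'Ax - b'x with A symmetric positive definite:

```python
import numpy as np

def gradient_family_step(A, b, x, method="sd"):
    """One iteration of a gradient method on f(x) = 0.5 x'Ax - b'x.

    "sd": Cauchy stepsize    g'g / g'Ag     (minimizes f along -g)
    "mg": minimal-gradient   g'Ag / g'A^2g  (minimizes ||grad f|| along -g)
    """
    g = A @ x - b          # gradient of the quadratic
    Ag = A @ g
    alpha = (g @ g) / (g @ Ag) if method == "sd" else (g @ Ag) / (Ag @ Ag)
    return x - alpha * g
```

Tracking the normalized gradients g/||g|| along the iterations makes the asymptotic two-direction zigzag visible.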

On Inexact Solution of Auxiliary Problems in Tensor Methods for Convex Optimization

In this paper we study the auxiliary problems that appear in $p$-order tensor methods for unconstrained minimization of convex functions with $\nu$-Hölder continuous $p$th derivatives. This type of auxiliary problem corresponds to the minimization of a $(p+\nu)$-order regularization of the $p$th-order Taylor approximation of the objective. For the case $p=3$, we consider the use …
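For intuition, here is a sketch of the analogous auxiliary problem for $p=2$ and $\nu=1$, i.e. the cubic-regularized model, solved with a general-purpose optimizer; the paper itself treats $p=3$ with fourth-order regularization, and all names below are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_model_step(f_x, g, H, sigma, s0):
    """Minimize the p=2, nu=1 regularized Taylor model
        m(s) = f_x + g's + 0.5 s'Hs + (sigma/3) ||s||^3,
    a low-order analogue of the (p+nu)-order auxiliary problem."""
    def model(s):
        return f_x + g @ s + 0.5 * (s @ H @ s) + sigma / 3.0 * np.linalg.norm(s) ** 3

    def model_grad(s):
        return g + H @ s + sigma * np.linalg.norm(s) * s

    return minimize(model, s0, jac=model_grad, method="BFGS").x
```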

Stabilized Barzilai-Borwein method

The Barzilai-Borwein (BB) method is a popular and efficient tool for solving large-scale unconstrained optimization problems. Its search direction is the same as in the steepest descent (Cauchy) method, but its stepsize rule is different. Owing to this, it converges much faster than the Cauchy method. A feature of the BB method is that it …
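For reference, here is the first of the two standard BB stepsize rules in a bare-bones sketch; practical codes add safeguards and a (typically non-monotone) line search, which are omitted here.

```python
import numpy as np

def bb_gradient(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=500):
    """Gradient method with the BB1 stepsize alpha_k = s's / s'y,
    where s = x_k - x_{k-1} and y = grad_k - grad_{k-1}."""
    x, g, alpha = x0, grad(x0), alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else alpha0  # guard against s'y <= 0
        x, g = x_new, g_new
    return x
```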

Tensor Methods for Finding Approximate Stationary Points of Convex Functions

In this paper we consider the problem of finding $\epsilon$-approximate stationary points of convex functions that are $p$-times differentiable with $\nu$-Hölder continuous $p$th derivatives. We present tensor methods with and without acceleration. Specifically, we show that the non-accelerated schemes take at most $\mathcal{O}\left(\epsilon^{-1/(p+\nu-1)}\right)$ iterations to reduce the norm of the gradient of the objective below …
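Plugging concrete smoothness levels into the stated non-accelerated bound is simple arithmetic. For example,
\[
  p=1,\ \nu=1:\ \mathcal{O}\left(\epsilon^{-1}\right), \qquad
  p=2,\ \nu=1:\ \mathcal{O}\left(\epsilon^{-1/2}\right),
\]
matching the familiar rates for gradient-type methods (Lipschitz gradient) and cubic-regularized Newton methods (Lipschitz Hessian), respectively.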

Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives

In this paper we study $p$-order methods for unconstrained minimization of convex functions that are $p$-times differentiable with $\nu$-Hölder continuous $p$th derivatives. We propose tensor schemes with and without acceleration. For the schemes without acceleration, we establish iteration complexity bounds of $\mathcal{O}\left(\epsilon^{-1/(p+\nu-1)}\right)$ for reducing the functional residual below a given $\epsilon\in (0,1)$. Assuming that $\nu$ …
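The same arithmetic applies to the functional residual: $p=1$, $\nu=1$ gives $\mathcal{O}\left(\epsilon^{-1}\right)$, the counterpart of the classical $\mathcal{O}(1/k)$ residual decay of gradient descent, while $p=2$, $\nu=1$ gives $\mathcal{O}\left(\epsilon^{-1/2}\right)$, the counterpart of the $\mathcal{O}(1/k^2)$ decay of cubic-regularized Newton methods.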

A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions

To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, the repetition of some BFGS updates was proposed in [1, 2]. However, the extra updates must be selected carefully, since the repetition process can be time-consuming. We show that for the limited-memory variable metric BNS method, matrix …
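As background for what the repeated updates modify, here is the standard L-BFGS two-loop recursion that builds the search direction from the stored pairs (s_i, y_i); this is the textbook baseline, not the BNS variant with repeated updates studied in the paper.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: returns -H_k g for the implicit L-BFGS
    inverse Hessian approximation H_k built from the stored pairs
    (s_i, y_i), ordered oldest to newest."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                       # standard initial scaling gamma_k * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for s, y, a in zip(s_list, y_list, reversed(alphas)):  # oldest first
        beta = (y @ q) / (y @ s)
        q += (a - beta) * s
    return -q
```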

Concise Complexity Analyses for Trust-Region Methods

Concise complexity analyses are presented for simple trust-region algorithms for solving unconstrained optimization problems. In contrast to a traditional trust-region algorithm, the algorithms considered in this paper require certain control over the choice of the trust-region radius after any successful iteration. The analyses highlight the essential algorithmic components required to obtain certain complexity …
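A bare-bones example of the kind of algorithm being analyzed, using a Cauchy-point step; the radius update after a successful iteration (here a plain doubling) is precisely the component on which such analyses place extra conditions, and all constants are illustrative.

```python
import numpy as np

def trust_region(f, grad, hess, x, delta=1.0, eta=0.1, tol=1e-6, max_iter=200):
    """Simple trust-region method with a Cauchy-point step."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        # Cauchy point: minimize the quadratic model along -g within the ball
        t = delta / np.linalg.norm(g)
        gHg = g @ H @ g
        if gHg > 0:
            t = min(t, (g @ g) / gHg)
        s = -t * g
        pred = -(g @ s + 0.5 * (s @ H @ s))     # predicted model decrease (> 0)
        rho = (f(x) - f(x + s)) / pred          # actual-to-predicted ratio
        if rho >= eta:
            x, delta = x + s, 2.0 * delta       # successful: accept and expand
        else:
            delta *= 0.25                       # unsuccessful: shrink
    return x
```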

On the use of third-order models with fourth-order regularization for unconstrained optimization

In a recent paper, it was shown that, for the smooth unconstrained optimization problem, worst-case evaluation complexity $\mathcal{O}\left(\epsilon^{-(p+1)/p}\right)$ may be obtained by means of algorithms that employ sequential approximate minimizations of $p$th-order Taylor models plus $(p+1)$th-order regularization terms. The aforementioned result, which assumes Lipschitz continuity of the $p$th partial derivatives, generalizes …
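Concretely, each iteration of such a method approximately minimizes a regularized Taylor model; writing $T_p(x_k;s)$ for the $p$th-order Taylor expansion of the objective at $x_k$ and $\sigma_k>0$ for a regularization weight (shorthand introduced here),
\[
  m_k(s) \;=\; T_p(x_k;s) \;+\; \sigma_k \|s\|^{p+1},
\]
and the bound $\mathcal{O}\left(\epsilon^{-(p+1)/p}\right)$ instantiates to $\mathcal{O}(\epsilon^{-2})$ for $p=1$, $\mathcal{O}(\epsilon^{-3/2})$ for $p=2$, and $\mathcal{O}(\epsilon^{-4/3})$ for the third-order models considered here.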

An Inexact Regularized Newton Framework with a Worst-Case Iteration Complexity of $\mathcal{O}(\epsilon^{-3/2})$ for Nonconvex Optimization

An algorithm for solving smooth nonconvex optimization problems is proposed that, in the worst case, takes $\mathcal{O}(\epsilon^{-3/2})$ iterations to drive the norm of the gradient of the objective function below a prescribed positive real number $\epsilon$, and can take $\mathcal{O}(\epsilon^{-3})$ iterations to drive the leftmost eigenvalue of the Hessian of the objective above $-\epsilon$. The proposed …
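For orientation, here is the core linear-algebra step of a regularized Newton method in a minimal sketch; the paper's contribution lies in solving this system inexactly and choosing the regularization adaptively, neither of which is shown here.

```python
import numpy as np

def regularized_newton_step(g, H, lam):
    """Solve the shifted Newton system (H + lam*I) s = -g exactly.

    lam > 0 is assumed large enough that H + lam*I is positive
    definite; an inexact framework would apply, e.g., conjugate
    gradients instead of a direct solve."""
    return np.linalg.solve(H + lam * np.eye(g.shape[0]), -g)
```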