An Efficient Retraction Mapping for the Symplectic Stiefel Manifold

This article introduces a new retraction on the symplectic Stiefel manifold. The most expensive operation in computing the novel retraction is the inversion of a $2p \times 2p$ matrix, which is much cheaper than the operations required by the retractions available in the literature. With the new retraction, we then design a constraint …
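
Since the abstract is truncated, the NumPy sketch below only illustrates the setting: the symplectic Stiefel constraint $X^\top J_{2n} X = J_{2p}$ and, in a closing comment, the cost claim made above. The function names (`sympl_J`, `on_symplectic_stiefel`) are hypothetical, and this is not the authors' retraction formula.

```python
import numpy as np

def sympl_J(m):
    """Standard symplectic matrix J_{2m} = [[0, I_m], [-I_m, 0]]."""
    I, Z = np.eye(m), np.zeros((m, m))
    return np.block([[Z, I], [-I, Z]])

def on_symplectic_stiefel(X, tol=1e-10):
    """Feasibility test: X lies on Sp(2p, 2n) iff X^T J_{2n} X = J_{2p}."""
    n, p = X.shape[0] // 2, X.shape[1] // 2
    return np.linalg.norm(X.T @ sympl_J(n) @ X - sympl_J(p)) <= tol

# A canonical feasible point: columns e_1..e_p and e_{n+1}..e_{n+p} of I_{2n}.
n, p = 50, 3
X = np.eye(2 * n)[:, list(range(p)) + list(range(n, n + p))]
assert on_symplectic_stiefel(X)

# The cost claim in code terms: the dominant operation of the new retraction
# is one solve with a (2p x 2p) matrix, e.g. np.linalg.solve(M, B) with
# M.shape == (2p, 2p), far cheaper than the (2n x 2n) factorizations that
# other retractions require when p << n.
```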

An Accelerated Minimal Gradient Method with Momentum for Convex Quadratic Optimization

In this article we address the problem of minimizing a strictly convex quadratic function using a novel iterative method. The new algorithm is based on Nesterov's well-known accelerated gradient method. At each iteration of our scheme, the new point is computed by performing a line search along a search direction given by a linear …
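
As a concrete (and hypothetical) reading of the update just described, the sketch below minimizes $f(x) = \tfrac{1}{2} x^\top A x - b^\top x$ using a direction that linearly combines the negative gradient with a momentum term, followed by an exact line search. The momentum weight `beta` and all names are illustrative, not the paper's exact scheme.

```python
import numpy as np

def momentum_linesearch_qp(A, b, x0, beta=0.3, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a gradient-plus-momentum direction and an exact line search.
    Illustrative sketch, not the paper's exact update."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(max_iter):
        g = A @ x - b                     # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        d = -g + beta * (x - x_prev)      # linear combination: gradient + momentum
        alpha = -(g @ d) / (d @ (A @ d))  # exact (optimal) step along d
        x_prev, x = x, x + alpha * d
    return x
```

For a quadratic, the exact step along $d$ has the closed form $\alpha = -g^\top d / (d^\top A d)$, which is what the one-line line search above computes.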

Two novel gradient methods with optimal step sizes

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method applied to strictly convex quadratic optimization problems. The proposed step sizes employ second-order information to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information about the Hessian of …
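
For reference, the classical Barzilai-Borwein (BB) step sizes the abstract alludes to are $\alpha_k^{BB1} = s_k^\top s_k / s_k^\top y_k$ and $\alpha_k^{BB2} = s_k^\top y_k / y_k^\top y_k$, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The sketch below runs the gradient method with BB1 on a strictly convex quadratic; the two new step sizes proposed in the paper would replace the `alpha` update, and the function name is hypothetical.

```python
import numpy as np

def bb_gradient_qp(A, b, x0, tol=1e-8, max_iter=1000):
    """Gradient method for f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite) with the classical BB1 step size."""
    x = x0.copy()
    g = A @ x - b
    alpha = 1.0 / max(np.linalg.norm(g), 1e-16)  # safeguarded first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g   # s_k and y_k; for quadratics y = A s
        alpha = (s @ s) / (s @ y)     # BB1; BB2 would be (s @ y) / (y @ y)
        x, g = x_new, g_new
    return x
```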

A Hybrid Gradient Method for Strictly Convex Quadratic Programming

In this paper, a reliable hybrid algorithm for solving convex quadratic minimization problems is presented. At each iteration, two points are computed: first, an auxiliary point $\dot{x}_k$ is generated by taking a gradient step with an optimal steplength; then, the next iterate $x_{k+1}$ is obtained as a weighted sum of $\dot{x}_k$ and the penultimate …
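
A plausible reading of this two-stage update, sketched below under stated assumptions: the auxiliary point uses the exact (Cauchy) steplength for the quadratic, and the "penultimate" term is taken to be the previous iterate $x_{k-1}$. The weight `theta` and the function name are hypothetical, and the paper's weighting rule may differ.

```python
import numpy as np

def hybrid_gradient_qp(A, b, x0, theta=0.9, tol=1e-8, max_iter=1000):
    """Hybrid scheme for f(x) = 0.5 x^T A x - b^T x (A symmetric positive
    definite): an exact-steplength gradient step, then a weighted sum."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(max_iter):
        g = A @ x - b
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact (Cauchy) steplength
        x_dot = x - alpha * g            # auxiliary point \dot{x}_k
        # next iterate: weighted sum of the auxiliary point and x_{k-1}
        x_prev, x = x, theta * x_dot + (1 - theta) * x_prev
    return x
```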