An Accelerated Minimal Gradient Method with Momentum for Convex Quadratic Optimization

In this article we address the problem of minimizing a strictly convex quadratic function using a novel iterative method. The new algorithm is based on the well-known Nesterov accelerated gradient method. At each iteration of our scheme, the new point is computed by performing a line search along a direction given by a linear …
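As background for this entry, the classical Nesterov accelerated gradient method on a strictly convex quadratic can be sketched as follows. This is a minimal illustration of the baseline scheme the paper builds on, not the paper's new line-search method; the constant-momentum coefficient for strongly convex problems is assumed.

```python
import numpy as np

def nesterov_quadratic(A, b, x0, iters=500):
    """Classical Nesterov AGD for f(x) = 0.5 x^T A x - b^T x, A symmetric PD."""
    eigs = np.linalg.eigvalsh(A)
    L, mu = eigs.max(), eigs.min()       # gradient Lipschitz / strong convexity constants
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))  # momentum coefficient
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)      # momentum extrapolation
        grad = A @ y - b                 # gradient of the quadratic at y
        x_prev, x = x, y - grad / L      # gradient step from the extrapolated point
    return x

A = np.diag([1.0, 10.0])
b = np.array([1.0, 2.0])
x_star = nesterov_quadratic(A, b, np.zeros(2))
# x_star approximates the minimizer, which solves A x = b
```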

An optimal control theory for accelerated optimization

Accelerated optimization algorithms can be generated using a double-integrator model for the search dynamics embedded in an optimal control problem. Citation: unpublished.
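To make the double-integrator idea concrete, here is a hedged sketch (not the paper's controller): the search dynamic x'' = u with a simple feedback control u = -grad f(x) - c x', discretized by semi-implicit Euler, yields momentum-style iterations. The damping coefficient c and step size h below are illustrative choices.

```python
import numpy as np

def double_integrator_descent(grad, x0, c=3.0, h=0.05, steps=2000):
    """Minimize f by simulating x'' = -grad f(x) - c x' (semi-implicit Euler)."""
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v += h * (-grad(x) - c * v)   # velocity update from the control law
        x += h * v                    # position update
    return x

grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])  # gradient of x1^2 + 10 x2^2
x_min = double_integrator_descent(grad, np.array([3.0, -2.0]))
# x_min approaches the minimizer at the origin
```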

Nonsmooth Algorithms and Nesterov’s Smoothing Techniques for Generalized Fermat-Torricelli Problems

In this paper we present some algorithms for solving a number of new models of facility location involving sets which generalize the classical Fermat-Torricelli problem. Our approach uses subgradient-type algorithms to cope with nondifferentiability of the distance functions therein. Another approach involves approximating nonsmooth optimization problems by smooth optimization problems using Nesterov's smoothing techniques. Convergence …
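The smoothing approach can be illustrated on the classical point-anchor Fermat-Torricelli problem (the paper treats the more general set version). Nesterov's smoothing of the Euclidean norm, ||z|| = max over ||u|| <= 1 of <u, z>, minus a prox term (mu/2)||u||^2, gives a Huber-type function with gradient z / max(mu, ||z||); gradient descent on the smoothed sum of distances is then straightforward. The parameters mu, lr, and iters below are illustrative.

```python
import numpy as np

def smoothed_fermat_torricelli(anchors, mu=1e-3, lr=0.1, iters=5000):
    """Minimize sum_i ||x - a_i|| via Nesterov smoothing of each norm."""
    x = anchors.mean(axis=0)                 # start at the centroid
    for _ in range(iters):
        diffs = x - anchors
        norms = np.linalg.norm(diffs, axis=1)
        # gradient of the smoothed norm: (x - a_i) / max(mu, ||x - a_i||)
        grad = (diffs / np.maximum(mu, norms)[:, None]).sum(axis=0)
        x = x - lr * grad
    return x

anchors = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
x_star = smoothed_fermat_torricelli(anchors)
# x_star approximates the Fermat-Torricelli point of the triangle
```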