Inexact projected gradient method for vector optimization

In this work, we propose an inexact projected gradient-like method for solving smooth constrained vector optimization problems. In the unconstrained case, we retrieve the steepest descent method introduced by Graña Drummond and Svaiter. In the constrained setting, the method extends the exact one proposed by Graña Drummond and Iusem, since it admits relative errors on the search directions. At each iteration, a decrease of the objective value is obtained by means of an Armijo-like rule. The convergence results for this new method extend those obtained by Fukuda and Graña Drummond for the exact version. Basically, for antisymmetric and non-antisymmetric partial orders, under some reasonable hypotheses, global convergence of all sequences produced by the inexact projected gradient method to weakly efficient points is established for objective functions that are convex with respect to the ordering cone. In the convergence analysis, we also establish a connection between the so-called weighting method and the one we propose.
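
The sketch below (not from the paper; a minimal illustration in Python/NumPy) shows one possible reading of such an iteration for a bi-objective problem ordered by the Pareto cone over a box-constrained feasible set. The direction subproblem is solved only approximately, by scanning a small grid of weights and taking the best scalarized projected gradient step, which loosely illustrates both the tolerance for inexact directions and the connection with the weighting method; all function names and parameter values (beta, rho, tol) are illustrative assumptions.

```python
import numpy as np

def project_box(y, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(y, lo, hi)

def approx_direction(x, JFx, lo, hi, beta=1.0, n_weights=21):
    """Approximately solve the direction subproblem
        min_{v : x+v in C}  max_i <grad f_i(x), v> + (beta/2)||v||^2
    by scanning weights w on the unit simplex: for each w, the scalarized
    projected gradient step v_w = P_C(x - JF(x)^T w / beta) - x is feasible,
    and we keep the one with the smallest subproblem value.  The grid search
    yields an inexact direction and mirrors the weighting connection."""
    best_v, best_val = None, np.inf
    for lam in np.linspace(0.0, 1.0, n_weights):
        w = np.array([lam, 1.0 - lam])
        v = project_box(x - JFx.T @ w / beta, lo, hi) - x
        val = np.max(JFx @ v) + 0.5 * beta * np.dot(v, v)
        if val < best_val:
            best_v, best_val = v, val
    return best_v, best_val

def ipg_step(F, JF, x, lo, hi, beta=1.0, rho=1e-4, tol=1e-8, max_backtracks=50):
    """One inexact projected gradient iteration with an Armijo-like rule
    for the componentwise (Pareto) order."""
    JFx = JF(x)
    v, val = approx_direction(x, JFx, lo, hi, beta)
    if val > -tol:                      # approximately critical: stop
        return x, True
    Fx, slopes = F(x), JFx @ v          # slopes_i = <grad f_i(x), v> < 0 here
    t = 1.0
    for _ in range(max_backtracks):
        # Armijo-like acceptance: F(x + t v) <= F(x) + rho * t * JF(x) v,
        # componentwise, i.e. with respect to the Pareto ordering cone.
        if np.all(F(x + t * v) <= Fx + rho * t * slopes):
            return x + t * v, False
        t *= 0.5
    return x, True                      # line search stalled

# Toy usage: minimize F(x) = (||x - a||^2, ||x - b||^2) over a box.
if __name__ == "__main__":
    a, b = np.array([0.0, 0.0]), np.array([2.0, 1.0])
    F = lambda x: np.array([np.sum((x - a) ** 2), np.sum((x - b) ** 2)])
    JF = lambda x: np.vstack([2.0 * (x - a), 2.0 * (x - b)])
    x = np.array([3.0, 3.0])
    lo, hi = np.array([-1.0, -1.0]), np.array([4.0, 4.0])
    for _ in range(500):
        x, stop = ipg_step(F, JF, x, lo, hi)
        if stop:
            break
    print("approximately weakly efficient point:", x)
```

The grid of weights merely stands in for whatever approximate subproblem solver one prefers; the paper's relative-error acceptance criterion on the search direction would replace the simple tolerance check used here.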

Citation

State University of Campinas and Federal University of Rio de Janeiro, Brazil, June/2011.
