An adaptive line-search-free multiobjective gradient method and its iteration-complexity analysis

This work introduces an Adaptive Line-Search-Free Multiobjective Gradient (AMG) method for solving smooth multiobjective optimization problems.
The proposed approach automatically adjusts stepsizes based on steepest descent directions, promoting robustness to the stepsize choice while keeping the computational cost low. The method is tailored to the multiobjective setting and requires no function evaluations, which makes it well suited to problems where objective values are expensive to compute. The algorithm admits two variants: (i) a conservative variant, in which the stepsize is monotonically decreasing; and (ii) a flexible variant, which allows occasional increases in the stepsize. From a theoretical standpoint, under standard Lipschitz continuity assumptions on the gradients, we establish iteration-complexity bounds for reaching a Pareto critical point with both variants in the nonconvex setting.
In the convex setting, we further derive improved iteration-complexity bounds for the conservative AMG variant.
From a practical standpoint, numerical experiments demonstrate that the flexible AMG variant compares favorably with the steepest descent method using either a fixed stepsize or an Armijo line search.
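To make the ideas in the abstract concrete, the following is a minimal illustrative sketch, not the paper's actual AMG method: it computes the multiobjective steepest descent direction for two objectives (the negative of the minimum-norm element of the convex hull of the gradients, which has a closed form when m = 2) and pairs it with a line-search-free, monotonically decreasing stepsize driven by a local curvature estimate rather than function evaluations. The specific stepsize update below is an assumed placeholder in the spirit of the conservative variant; the paper's precise rule differs.

```python
import numpy as np

def steepest_descent_direction(g1, g2):
    """Multiobjective steepest descent direction for two objectives:
    the negative of the minimum-norm convex combination of the
    gradients (closed form for m = 2)."""
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.5 if denom == 0.0 else np.clip((g2 @ diff) / denom, 0.0, 1.0)
    return -(lam * g1 + (1.0 - lam) * g2)

def amg_sketch(grads, x0, t0=1.0, iters=200):
    """Line-search-free multiobjective gradient sketch (conservative
    flavor): the stepsize only decreases, controlled by a local
    inverse-Lipschitz estimate instead of function evaluations.
    Illustrative placeholder, not the paper's AMG update rule."""
    x, t = np.asarray(x0, dtype=float), t0
    x_prev, d_prev = None, None
    for _ in range(iters):
        d = steepest_descent_direction(*[g(x) for g in grads])
        if d_prev is not None:
            dd = np.linalg.norm(d - d_prev)
            if dd > 0.0:
                # Shrink (never grow) the stepsize when the directions
                # change quickly relative to the movement in x.
                t = min(t, 0.5 * np.linalg.norm(x - x_prev) / dd)
        x_prev, d_prev = x, d
        x = x + t * d
    return x, d

# Simple biobjective quadratic: f1(x) = ||x - a||^2 / 2,
# f2(x) = ||x - b||^2 / 2; the Pareto set is the segment [a, b].
a, b = np.array([0.0, 0.0]), np.array([1.0, 1.0])
grads = [lambda x: x - a, lambda x: x - b]
x_star, d_star = amg_sketch(grads, x0=np.array([3.0, -2.0]))
```

At a Pareto critical point the minimum-norm convex combination of the gradients vanishes, so `d_star` approaching zero signals criticality, the stopping criterion typically used in this setting.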
