A Simple Adaptive Proximal Gradient Method for Nonconvex Optimization

Consider composite nonconvex optimization problems where the objective function consists of a smooth nonconvex term (with Lipschitz-continuous gradient) and a convex (possibly nonsmooth) term. Existing parameter-free methods for such problems often rely on complex multi-loop structures, require line searches, or depend on restrictive assumptions (e.g., bounded iterates). To address these limitations, we introduce a novel …
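
For orientation, the sketch below shows the standard proximal gradient iteration for such composite problems. The stepsize is governed by a simple local-curvature estimate that only ever increases, which is a conservative heuristic standing in for the adaptive rule proposed in the paper (whose details are not included in this excerpt); the test problem, a convex least-squares term plus an l1 term, and all parameter values are illustrative assumptions.

import numpy as np

def prox_l1(v, t):
    # proximal operator of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_g, prox_h, x0, L0=1.0, iters=500):
    # Prox-gradient iteration x+ = prox_{h/L}(x - grad_g(x)/L).
    # L is raised to the largest local curvature estimate
    # ||grad_g(x+) - grad_g(x)|| / ||x+ - x|| seen so far: a conservative
    # heuristic, not the adaptive rule of the paper.
    x, L = np.asarray(x0, dtype=float), L0
    g = grad_g(x)
    for _ in range(iters):
        x_new = prox_h(x - g / L, 1.0 / L)
        step = np.linalg.norm(x_new - x)
        if step < 1e-12:          # (near) fixed point of the prox-gradient map
            break
        g_new = grad_g(x_new)
        L = max(L, np.linalg.norm(g_new - g) / step)
        x, g = x_new, g_new
    return x

# Illustrative composite problem: g(x) = 0.5*||Ax - b||^2, h(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b),
                          lambda v, t: prox_l1(v, lam * t),
                          x0=np.zeros(50))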

Steepest descent method using novel adaptive stepsizes for unconstrained nonlinear multiobjective programming

We propose new adaptive strategies for computing stepsizes in the steepest descent method for unconstrained nonlinear multiobjective optimization, without employing any line-search procedure. The resulting algorithms apply to a wide class of nonconvex unconstrained multi-criteria optimization problems whose objective gradients all satisfy a global Lipschitz continuity condition. In …
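
As a point of reference, the following sketch implements the classical steepest descent step for the two-objective special case, where the common descent direction is the negative of the minimum-norm element of the convex hull of the two gradients (computable in closed form for two objectives). A constant stepsize 1/L, with L an assumed common Lipschitz constant of both gradients, stands in for the adaptive stepsize strategies proposed in the paper, and the example objectives are purely illustrative.

import numpy as np

def min_norm_point(g1, g2):
    # minimum-norm point of the segment [g1, g2] (two-objective special case)
    diff = g2 - g1
    denom = diff @ diff
    t = 0.0 if denom == 0.0 else np.clip(-(g1 @ diff) / denom, 0.0, 1.0)
    return g1 + t * diff

def multiobjective_steepest_descent(grad_f1, grad_f2, x0, L=10.0, iters=200):
    # Common descent direction = negative of the min-norm element of
    # conv{grad f1(x), grad f2(x)}; the constant stepsize 1/L (L an assumed
    # common Lipschitz constant of both gradients) replaces the paper's
    # adaptive stepsize strategies.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = -min_norm_point(grad_f1(x), grad_f2(x))
        if np.linalg.norm(d) < 1e-10:   # Pareto-critical point reached
            break
        x = x + d / L
    return x

# Illustrative objectives: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2;
# the Pareto set is the segment between a and b.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x_star = multiobjective_steepest_descent(lambda x: 2 * (x - a),
                                         lambda x: 2 * (x - b),
                                         x0=np.array([3.0, -2.0]))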