Convergence of Descent Optimization Algorithms under Polyak-Lojasiewicz-Kurdyka Conditions

This paper develops a comprehensive convergence analysis for generic classes of descent algorithms in nonsmooth and nonconvex optimization under several conditions of the Polyak-Lojasiewicz-Kurdyka (PLK) type. Among other results, we prove the finite termination of generic algorithms under the PLK conditions with lower exponents. Specifications are given to establish new convergence rates for inexact reduced …
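
For orientation, PLK-type conditions are growth inequalities of roughly the following form; the precise variant, constants, and neighborhood used in the paper are not visible in this excerpt, so the display below is only a generic template with assumed notation \(f^{*}\) for the optimal value and \(\partial f\) for a subdifferential:

\[
\operatorname{dist}\bigl(0,\partial f(x)\bigr) \;\ge\; \mu\,\bigl(f(x)-f^{*}\bigr)^{\theta},
\qquad \mu>0,\ \ \theta\in[0,1).
\]

Smaller exponents \(\theta\) correspond to sharper growth of \(f\) away from its minimal value, which is the regime in which finite-termination results of this kind are typically obtained.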

Fully Adaptive Zeroth-Order Method for Minimizing Functions with Compressible Gradients

We propose an adaptive zeroth-order method for minimizing differentiable functions with L-Lipschitz continuous gradients. The method is designed to exploit the potential compressibility of the gradient of the objective function, but it does not require knowledge of the approximate sparsity level s or the Lipschitz constant L of the gradient. We show that …
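
As a point of reference, a basic (non-adaptive) zeroth-order gradient estimator built from two-point finite differences along random directions can be sketched as follows; the estimator, the test function, and all parameters below are illustrative and do not reproduce the paper's fully adaptive scheme.

```python
import numpy as np

def zo_gradient_estimate(f, x, num_dirs=20, h=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate along random Gaussian directions.

    Generic estimator only: the number of directions and the smoothing radius h
    are illustrative choices, not the adaptive rules of the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    g = np.zeros(n)
    for _ in range(num_dirs):
        u = rng.standard_normal(n)
        g += (f(x + h * u) - f(x - h * u)) / (2.0 * h) * u
    return g / num_dirs

# Illustrative use on a function whose gradient is nearly sparse ("compressible").
f = lambda x: x[0] ** 2 + 0.5 * x[1] ** 2 + 1e-6 * np.sum(x[2:] ** 2)
x = np.ones(50)
g_hat = zo_gradient_estimate(f, x)
```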

New Nonlinear Conjugate Gradient Methods with Guaranteed Descent for Multi-Objective Optimization

In this article, we present several examples of special nonlinear conjugate gradient directions for nonlinear (non-convex) multi-objective optimization. These directions are guaranteed to be descent directions for all objectives, independently of the line search. This allows us to equip the algorithm with a simple, Armijo-like backtracking scheme and to prove convergence to first-order critical points. In contrast to other popular conjugate …
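
To illustrate the line-search component only, the following sketch performs Armijo-like backtracking for a direction that is assumed to be a descent direction for every objective; the conjugate gradient construction of that direction in the article is not reproduced, and all names and constants are illustrative.

```python
import numpy as np

def armijo_backtracking_mo(objs, grads, x, d, c=1e-4, beta=0.5, t0=1.0, max_iter=50):
    """Armijo-like backtracking for a list of objectives sharing a common
    descent direction d (i.e., grads[i](x) @ d < 0 for every objective i).

    Generic sketch only; objs and grads are lists of callables.
    """
    t = t0
    f0 = [f(x) for f in objs]
    slopes = [g(x) @ d for g in grads]      # all assumed negative
    for _ in range(max_iter):
        if all(f(x + t * d) <= f0_i + c * t * s
               for f, f0_i, s in zip(objs, f0, slopes)):
            return t
        t *= beta
    return t

# Illustrative bi-objective example using the steepest-descent direction of f1,
# which at this particular point happens to be a descent direction for both.
f1, g1 = lambda x: x @ x, lambda x: 2 * x
f2, g2 = lambda x: np.sum((x - 1) ** 2), lambda x: 2 * (x - 1)
x = np.array([2.0, 2.0])
d = -g1(x)
t = armijo_backtracking_mo([f1, f2], [g1, g2], x, d)
```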

Stochastic first-order methods with multi-extrapolated momentum for highly smooth unconstrained optimization

In this paper we consider an unconstrained stochastic optimization problem where the objective function exhibits a high order of smoothness. In particular, we propose a stochastic first-order method (SFOM) with multi-extrapolated momentum, in which multiple extrapolations are performed in each iteration, followed by a momentum step based on these extrapolations. We show that our proposed …
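
The exact recursion is not shown in this excerpt; the sketch below only mimics the overall pattern of performing several extrapolation steps per iteration followed by a momentum update, with made-up coefficients and an assumed stochastic gradient oracle grad(x, rng).

```python
import numpy as np

def sfom_multi_extrapolation_sketch(grad, x0, num_extrap=2, lr=0.1,
                                    momentum=0.9, iters=100, rng=None):
    """Schematic stochastic first-order loop: several extrapolation steps per
    iteration, then a momentum step built from those extrapolations.

    The actual recursion and coefficients of the paper are not reproduced.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        z = x.copy()
        g_acc = np.zeros_like(x0)
        for _ in range(num_extrap):              # multiple extrapolations per iteration
            g = grad(z, rng)
            z = z - lr * g                       # extrapolated point
            g_acc += g
        v = momentum * v + g_acc / num_extrap    # momentum from the extrapolations
        x = x - lr * v
    return x

# Toy quadratic with additive gradient noise.
grad = lambda x, rng: 2 * x + 0.01 * rng.standard_normal(x.size)
x_final = sfom_multi_extrapolation_sketch(grad, np.ones(10))
```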

Randomized Subspace Derivative-Free Optimization with Quadratic Models and Second-Order Convergence

We consider model-based derivative-free optimization (DFO) for large-scale problems, based on iterative minimization in random subspaces. We provide the first worst-case complexity bound for such methods for convergence to approximate second-order critical points, and show that this bound has significantly improved dimension dependence compared to standard full-space methods, provided low-accuracy solutions are desired and/or …
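
As background on the general approach (not the algorithm of the paper), the following sketch builds a quadratic model of the objective restricted to a random low-dimensional subspace via finite differences and takes a regularized model step; interpolation sets, trust regions, and acceptance tests are omitted, and all parameters are illustrative.

```python
import numpy as np

def random_subspace_quadratic_step(f, x, p=5, h=1e-3, rng=None):
    """Build a quadratic model of f restricted to a random p-dimensional
    subspace and return a regularized model (Newton-type) step.

    Minimal sketch only; real model-based DFO methods are far more careful.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    P, _ = np.linalg.qr(rng.standard_normal((n, p)))   # orthonormal subspace basis
    f0 = f(x)
    g = np.zeros(p)
    H = np.zeros((p, p))
    for i in range(p):                                 # subspace gradient and Hessian
        e = P[:, i]
        fp, fm = f(x + h * e), f(x - h * e)
        g[i] = (fp - fm) / (2 * h)
        H[i, i] = (fp - 2 * f0 + fm) / h ** 2
        for j in range(i):
            e2 = P[:, j]
            fpp = f(x + h * e + h * e2)
            H[i, j] = H[j, i] = (fpp - fp - f(x + h * e2) + f0) / h ** 2
    s = np.linalg.solve(H + 1e-8 * np.eye(p), -g)      # regularized model step
    return x + P @ s

# Illustrative use on a 100-dimensional smooth test function.
x = np.full(100, 2.0)
x_new = random_subspace_quadratic_step(lambda v: np.sum(v ** 2) + np.sum(v[:3] ** 4), x)
```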

Exploiting Negative Curvature in Conjunction with Adaptive Sampling: Theoretical Results and a Practical Algorithm

In this paper, we propose algorithms that exploit negative curvature for solving noisy nonlinear nonconvex unconstrained optimization problems. We consider both deterministic and stochastic inexact settings, and develop two-step algorithms that combine directions of negative curvature and descent directions to update the iterates. Under reasonable assumptions, we prove second-order convergence results and derive complexity guarantees …
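
A minimal illustration of a two-step update of this type, using exact gradients and Hessians rather than the inexact quantities analyzed in the paper, might look as follows; the step sizes and the test function are illustrative.

```python
import numpy as np

def two_step_nc_descent(grad, hess, x, alpha=0.1, beta=0.1, tol=1e-8):
    """One 'two-step' update: a descent (negative gradient) step followed by a
    step along a direction of negative curvature, if one exists.

    Illustration only: exact derivatives are used, not noisy estimates.
    """
    # Step 1: descent direction.
    x = x - alpha * grad(x)
    # Step 2: negative curvature direction from the smallest Hessian eigenvalue.
    H = hess(x)
    lam, V = np.linalg.eigh(H)
    if lam[0] < -tol:
        v = V[:, 0]
        v = v if v @ grad(x) <= 0 else -v    # sign chosen so v does not oppose descent
        x = x + beta * abs(lam[0]) * v       # scale the step by the curvature magnitude
    return x

# Saddle-point example: f(x, y) = x^2 - y^2.
grad = lambda z: np.array([2 * z[0], -2 * z[1]])
hess = lambda z: np.diag([2.0, -2.0])
x = two_step_nc_descent(grad, hess, np.array([1.0, 1e-3]))
```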

Fast Unconstrained Optimization via Hessian Averaging and Adaptive Gradient Sampling Methods

We consider minimizing finite-sum and expectation objective functions via Hessian-averaging-based subsampled Newton methods. These methods allow for gradient inexactness and have fixed per-iteration Hessian approximation costs. The recent work of Na et al. (2023) demonstrated that Hessian averaging can be utilized to achieve fast \(\mathcal{O}\left(\sqrt{\frac{\log k}{k}}\right)\) local superlinear convergence for strongly convex functions in high …
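
For context, a bare-bones Newton iteration that replaces the exact Hessian by a running average of subsampled Hessian estimates is sketched below; the weighting scheme, gradient-sampling rule, and globalization of the paper are not reproduced, and the oracle names are assumptions.

```python
import numpy as np

def hessian_averaged_newton(grad, hess_sample, x0, iters=50, reg=1e-6, rng=None):
    """Newton-type iteration using a uniform running average of subsampled
    Hessians in place of the exact Hessian.

    Sketch only: hess_sample(x, rng) is assumed to return a noisy Hessian estimate.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    H_avg = np.zeros((x0.size, x0.size))
    for k in range(1, iters + 1):
        Hk = hess_sample(x, rng)            # subsampled Hessian estimate
        H_avg += (Hk - H_avg) / k           # running average over iterations
        g = grad(x)
        x = x - np.linalg.solve(H_avg + reg * np.eye(x.size), g)
    return x

# Toy strongly convex quadratic with noisy Hessian samples.
A = np.diag(np.linspace(1.0, 5.0, 10))
grad = lambda x: A @ x
hess_sample = lambda x, rng: A + 0.1 * np.diag(rng.standard_normal(10))
x_sol = hessian_averaged_newton(grad, hess_sample, np.ones(10))
```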

Local Convergence Analysis for Nonisolated Solutions to Derivative-Free Methods of Optimization

This paper provides a local convergence analysis for newly developed derivative-free methods in problems of smooth nonconvex optimization. We focus here on local convergence to local minimizers, which might be nonisolated and are therefore more challenging for convergence analysis. The main results provide efficient conditions for local convergence to arbitrary local minimizers under the fulfillment of …

Black-box Optimization Algorithms for Regularized Least-squares Problems

We consider the problem of optimizing the sum of a smooth, nonconvex function for which derivatives are unavailable, and a convex, nonsmooth function with an easy-to-evaluate proximal operator. Of particular focus is the case where the smooth part has a nonlinear least-squares structure. We adapt two existing approaches for derivative-free optimization of nonsmooth compositions of smooth …
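
As a simple reference point (not the adapted methods of the paper), a proximal-gradient-style loop in which the gradient of the smooth term is replaced by a finite-difference estimate, with the \(\ell_1\) prox as an example of an easy-to-evaluate proximal operator, could be sketched as follows; all names and parameters are illustrative.

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient estimate of the derivative-free smooth part."""
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

def df_prox_gradient(f, lam, x0, step, iters=200):
    """Proximal-gradient loop with a finite-difference gradient of the smooth term.

    Generic sketch; the model-based, least-squares-aware methods of the paper
    are not what is implemented here.
    """
    x = x0.copy()
    for _ in range(iters):
        x = prox_l1(x - step * fd_gradient(f, x), step * lam)
    return x

# Regularized least-squares toy problem: min 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
x_hat = df_prox_gradient(f, lam=0.5, x0=np.zeros(10),
                         step=1.0 / np.linalg.norm(A, 2) ** 2)
```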

Globally Convergent Derivative-Free Methods in Nonconvex Optimization with and without Noise

This paper studies nonconvex derivative-free optimization problems in which only information on either smooth objective functions or their noisy approximations is available. General derivative-free methods are proposed for minimizing differentiable (not necessarily convex) functions with globally Lipschitz continuous gradients, where the accuracy of the approximate gradients interacts with the stepsizes and exact gradient values. …
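
The precise rule coupling gradient accuracy with stepsizes is not visible in this excerpt; the sketch below only illustrates the general idea of tying the finite-difference increment to the current stepsize, with an assumed coupling factor and an illustrative test function.

```python
import numpy as np

def fd_descent_coupled(f, x0, step=0.05, iters=200, couple=0.1):
    """Gradient descent with forward-difference gradient approximations whose
    finite-difference increment is tied to the stepsize (here: couple * step).

    The specific coupling rule analyzed in the paper is not reproduced.
    """
    x = x0.copy()
    for _ in range(iters):
        h = couple * step                  # approximation accuracy tied to the stepsize
        f0 = f(x)
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f0) / h     # forward-difference partial derivative
        x = x - step * g
    return x

# Smooth, nonconvex test function with a globally Lipschitz gradient.
f = lambda z: np.sum(z ** 2) + np.cos(np.sum(z))
x_min = fd_descent_coupled(f, np.full(5, 2.0))
```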