Local Convergence Analysis for Nonisolated Solutions to Derivative-Free Methods of Optimization

This paper provides a local convergence analysis for newly developed derivative-free methods in problems of smooth nonconvex optimization. We focus here on local convergence to local minimizers, which might be nonisolated and hence more challenging for convergence analysis. The main results provide efficient conditions for local convergence to arbitrary local minimizers under the fulfillment of … Read more
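Purely as an illustration of the setting (not the paper's method), the sketch below minimizes f(x) = (‖x‖² − 1)², whose minimizers form the entire unit circle and are therefore nonisolated, using a plain forward-difference gradient step. The objective, starting point, and stepsize are assumptions chosen only for this example.

```python
# Minimal illustration of a nonisolated set of minimizers: every point on the
# unit circle minimizes f, which is the kind of situation the analysis targets.
# The derivative-free scheme here is a basic forward-difference gradient step
# with fixed parameters, chosen only to show convergence toward the circle.
import numpy as np

def f(x):
    # f(x) = (||x||^2 - 1)^2 attains its minimum 0 on the whole unit circle.
    return (np.dot(x, x) - 1.0) ** 2

def fd_gradient(fun, x, h):
    # Forward-difference approximation of the gradient (no derivatives used).
    g = np.zeros_like(x)
    fx = fun(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (fun(x + e) - fx) / h
    return g

x = np.array([0.3, 0.1])          # start away from the minimizer set
for k in range(200):
    g = fd_gradient(f, x, h=1e-6)
    x = x - 0.1 * g               # fixed stepsize, purely illustrative

print("final point:", x, "distance to unit circle:", abs(np.linalg.norm(x) - 1.0))
```

The iterates approach some point of the circle determined by the starting point, rather than a single isolated minimizer, which is exactly why standard isolated-minimizer arguments do not apply directly.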

Globally Convergent Derivative-Free Methods in Nonconvex Optimization with and without Noise

This paper addresses the study of nonconvex derivative-free optimization problems, where only information about either smooth objective functions or their noisy approximations is available. General derivative-free methods are proposed for minimizing differentiable (not necessarily convex) functions with globally Lipschitz continuous gradients, where the accuracy of approximate gradients interacts with stepsizes and exact gradient values. … Read more
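The following is a hedged sketch, not the authors' exact scheme: a derivative-free gradient method in which the finite-difference increment, and hence the accuracy of the approximate gradient, is tied to the stepsize and to an assumed bound on evaluation noise. The test objective, parameter names, and numerical choices are all assumptions made for illustration.

```python
# Sketch only: couple the finite-difference increment to the stepsize and the
# noise level so that the approximate-gradient error stays compatible with the
# steps taken. Objective and parameters are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
NOISE_LEVEL = 1e-8                 # assumed bound on the evaluation noise

def noisy_f(x):
    # Smooth nonconvex test objective plus bounded noise (Rosenbrock-like).
    fx = (1.0 - x[0]) ** 2 + 5.0 * (x[1] - x[0] ** 2) ** 2
    return fx + NOISE_LEVEL * rng.uniform(-1.0, 1.0)

def central_diff_grad(fun, x, t):
    # Central differences: the error behaves like O(t^2) + O(noise / t), so the
    # increment t must be balanced against both the stepsize and the noise.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = t
        g[i] = (fun(x + e) - fun(x - e)) / (2.0 * t)
    return g

x = np.array([-1.0, 1.0])
step = 0.02
for k in range(500):
    t = max(np.sqrt(NOISE_LEVEL), 0.1 * step)   # tie FD increment to stepsize and noise
    g = central_diff_grad(noisy_f, x, t)
    x = x - step * g

print("approximate stationary point:", x)
```

The point of the coupling is that shrinking the stepsize without controlling the finite-difference increment (or vice versa) would let the gradient error dominate the step, which is the interaction the abstract refers to.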

Inexact Reduced Gradient Methods in Nonconvex Optimization

This paper proposes and develops new linesearch methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Some abstract convergence results for a broad class of linesearch methods are established. A general scheme for inexact reduced gradient (IRG) methods is proposed, where the errors in the gradient approximation … Read more
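Below is a minimal sketch in the spirit of a linesearch method driven by an inexact gradient; it is not the IRG scheme from the paper. The test function, the simulated error model, the error-tightening schedule, and all parameter names are assumptions introduced for illustration.

```python
# Sketch only: backtracking (Armijo) linesearch along a negative inexact
# gradient whose error bound is tightened as iterations proceed.
import numpy as np

def f(x):
    # Smooth nonconvex function with two minimizers at (+1, 0) and (-1, 0).
    return (x[0] ** 2 - 1.0) ** 2 + x[1] ** 2

def grad_f(x):
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

def inexact_grad(x, delta, rng):
    # Exact gradient perturbed by a simulated error of norm at most delta.
    err = rng.normal(size=x.shape)
    err *= delta / max(np.linalg.norm(err), 1e-16)
    return grad_f(x) + err

rng = np.random.default_rng(1)
x = np.array([0.2, 2.0])
delta = 1e-1                        # initial bound on the gradient error
for k in range(100):
    g = inexact_grad(x, delta, rng)
    # Backtracking linesearch along the negative inexact gradient.
    step, beta, sigma = 1.0, 0.5, 1e-4
    while f(x - step * g) > f(x) - sigma * step * np.dot(g, g):
        step *= beta
        if step < 1e-12:            # safeguard when -g fails to be a descent direction
            break
    x = x - step * g
    delta *= 0.9                    # tighten the gradient-error tolerance over time

print("approximate stationary point:", x, "f =", f(x))
```

The safeguard inside the linesearch matters because, near a stationary point, the gradient error can outweigh the true gradient and the inexact direction may fail to be a descent direction; managing this trade-off is the kind of issue the abstract's error conditions address.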