Block Coordinate Proximal Gradient Method for Nonconvex Optimization Problems: Convergence Analysis

We propose a block coordinate proximal gradient method for a composite minimization problem whose objective contains two nonconvex components, only one of which is assumed to be differentiable. Under per-block Lipschitz-like conditions based on the Bregman distance, but without global Lipschitz continuity of the gradient of the differentiable component, we prove …
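
For orientation, a minimal Python sketch of one sweep of such a method is given below; the function names and the l1 prox are illustrative placeholders, and the Euclidean proximal step shown here is only a stand-in for the paper's per-block Bregman proximal steps.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding), used here as a
    stand-in for the nonsmooth block component."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_prox_grad_sweep(x, blocks, grad_f, step, prox=prox_l1):
    """One sweep of a block coordinate proximal gradient method: for each block,
    take a gradient step on the smooth part restricted to that block, then apply
    the prox of the nonsmooth part.  Illustrative Euclidean version only."""
    x = x.copy()
    for j in blocks:                      # j is an index array selecting block j
        g = grad_f(x)[j]                  # partial gradient w.r.t. block j
        x[j] = prox(x[j] - step * g, step)
    return x
```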

A Merit Function Approach for Evolution Strategies

In this paper, we extend a class of globally convergent evolution strategies to handle general constrained optimization problems. The proposed framework handles relaxable constraints using a merit function approach combined with a specific restoration procedure. The unrelaxable constraints in our framework, when present, are treated either by using the extreme barrier function or through a …
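
As a rough illustration of the merit-function idea (our own simplified notation, not the paper's exact construction), one may combine the objective with a penalized measure of relaxable-constraint violation and an extreme barrier for the unrelaxable constraints:

```python
import numpy as np

def merit(x, f, c_relax, c_unrelax, mu):
    """Illustrative merit function: objective plus a weighted l1 measure of
    relaxable-constraint violation; unrelaxable constraints are handled by an
    extreme barrier (merit = +inf outside their feasible set).
    f: objective; c_relax, c_unrelax: functions returning arrays of inequality
    constraint values (feasible when <= 0); mu: penalty weight."""
    if np.any(c_unrelax(x) > 0):
        return np.inf                         # extreme barrier
    violation = np.sum(np.maximum(c_relax(x), 0.0))
    return f(x) + mu * violation
```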

Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

We present DFO-LS, a software package for derivative-free optimization (DFO) of nonlinear least-squares (LS) problems, with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals. DFO-LS allows flexible initialization for expensive problems, whereby it can begin making progress from as few as two objective evaluations. Numerical results …
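
A minimal usage sketch, assuming the publicly available DFO-LS Python interface (dfols.solve taking a residual function, a starting point, and optional bounds); the Rosenbrock residuals below are our own toy example:

```python
import numpy as np
import dfols  # pip install DFO-LS; interface shown as we understand it from the package documentation

def residuals(x):
    """Rosenbrock written as a least-squares problem: r1 = 10*(x2 - x1^2), r2 = 1 - x1."""
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x0 = np.array([-1.2, 1.0])
lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
soln = dfols.solve(residuals, x0, bounds=(lower, upper))
print(soln.x)  # approximate minimizer
```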

Derivative-Free Superiorization With Component-Wise Perturbations

Superiorization reduces, though it does not necessarily minimize, the value of a target function while seeking constraints-compatibility. This is done by taking a solely feasibility-seeking algorithm, analyzing its perturbation resilience, and proactively perturbing its iterates accordingly to steer them toward a feasible point with a reduced value of the target function. When the perturbation steps are computationally efficient, this …
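
The following Python sketch illustrates the general superiorization pattern under our own simplifying assumptions (a user-supplied feasibility-seeking step project and a target function phi); it is not the paper's specific component-wise scheme:

```python
import numpy as np

def superiorize(x, project, phi, n_iter=100, beta0=1.0, decay=0.9):
    """Schematic superiorization loop: interleave derivative-free, component-wise
    perturbations that do not increase the target function phi with a
    feasibility-seeking step 'project'."""
    x = x.copy()
    beta = beta0
    for _ in range(n_iter):
        # component-wise, derivative-free perturbation: accept a +/- step in each
        # coordinate only if it strictly decreases phi
        for i in range(len(x)):
            for delta in (beta, -beta):
                y = x.copy()
                y[i] += delta
                if phi(y) < phi(x):
                    x = y
                    break
        x = project(x)          # the underlying feasibility-seeking step
        beta *= decay           # shrinking perturbation sizes
    return x
```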

Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods

This paper presents a finite difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming (2012) and Moré and Wild (2011). This noise estimation procedure …
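
As a rough sketch of the finite-difference ingredient (illustrative constants, not the paper's adaptive procedure), the differencing interval can be balanced against an estimated noise level eps_f as follows:

```python
import numpy as np

def fd_grad(f, x, eps_f, curvature=1.0):
    """Forward-difference gradient with an interval balanced against the noise
    level eps_f: h = 2*sqrt(eps_f / curvature) minimizes the sum of the
    truncation error (~ h*curvature/2) and the noise error (~ 2*eps_f/h)."""
    h = 2.0 * np.sqrt(eps_f / max(curvature, 1e-12))
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Such a noisy gradient estimate can then be fed to a BFGS-type routine,
# e.g. scipy.optimize.minimize(f, x0, jac=lambda x: fd_grad(f, x, eps_f), method="BFGS").
```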

A Dynamic Penalty Parameter Updating Strategy for Matrix-Free Sequential Quadratic Optimization

This paper focuses on the design of sequential quadratic optimization (commonly known as SQP) methods for solving large-scale nonlinear optimization problems. The most computationally demanding aspect of such an approach is the computation of the search direction during each iteration, for which we consider the use of matrix-free methods. In particular, we develop a method …
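
For context, one common form of an exact-penalty SQP subproblem is sketched below in our own notation; the paper's matrix-free solver and its dynamic update of the penalty parameter are not reproduced here.

```latex
% One common form of an exact-penalty SQP subproblem (illustrative notation):
% the search direction d_k minimizes a local model of the penalty function,
% with penalty parameter rho weighting the objective model against the
% linearized constraint violation.
\[
  d_k \in \arg\min_{d}\;
  \rho \left( \nabla f(x_k)^{\top} d + \tfrac{1}{2}\, d^{\top} H_k d \right)
  + \sum_{i} \bigl| c_i(x_k) + \nabla c_i(x_k)^{\top} d \bigr|
\]
```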

On the Complexity of Testing Attainment of the Optimal Value in Nonlinear Optimization

We prove that unless P=NP, there exists no polynomial-time (or even pseudo-polynomial-time) algorithm that can test whether the optimal value of a nonlinear optimization problem, in which the objective and constraints are given by low-degree polynomials, is attained. If the degrees of these polynomials are fixed, our results, along with previously known “Frank-Wolfe type” theorems, …
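
A standard example (not from the paper) showing that attainment of the optimal value can fail even for a low-degree polynomial objective:

```latex
% The degree-4 polynomial below is positive everywhere, yet its infimum over R^2
% is 0 (take x = 1/y and let y tend to infinity), so the optimal value 0 is never
% attained.
\[
  \inf_{(x,y)\in\mathbb{R}^2} \; x^2 + (xy - 1)^2 \;=\; 0,
  \qquad x^2 + (xy - 1)^2 > 0 \ \ \text{for all } (x,y).
\]
```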

Solving Quadratic Programs to High Precision using Scaled Iterative Refinement

Quadratic optimization problems (QPs) are ubiquitous, and solution algorithms for them have matured into a reliable technology. However, the precision of solutions is usually limited by the underlying floating-point operations. This may cause inconveniences when solutions are used for rigorous reasoning. We contribute on three levels to overcome this issue. First, we present a novel refinement …
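
The core idea of iterative refinement is shown below for a generic linear system, as a stand-in for the full QP case treated in the paper (names and scaling rule are illustrative):

```python
import numpy as np

def refine(solve, A, b, n_ref=3):
    """Plain iterative refinement for A z = b (illustrative of the idea only:
    the paper refines entire QPs, scaling the residual problem at every round).
    'solve' is any approximate solver, e.g. a low-precision factorization."""
    z = solve(b)
    for _ in range(n_ref):
        r = b - A @ z                            # residual of the current approximation
        scale = 1.0 / max(np.max(np.abs(r)), np.finfo(float).tiny)
        z = z + solve(scale * r) / scale         # solve the scaled residual problem, unscale
    return z
```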

A Globally Asymptotically Stable Polynomial Vector Field with Rational Coefficients and no Local Polynomial Lyapunov Function

We give an explicit example of a two-dimensional polynomial vector field of degree seven that has rational coefficients, is globally asymptotically stable, but does not admit an analytic Lyapunov function even locally.
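
For reference, the standard conditions a Lyapunov function must satisfy are recalled below; these are textbook definitions, not material taken from the paper.

```latex
% A Lyapunov function V certifying local asymptotic stability of \dot{x} = f(x)
% at the origin must satisfy, on some neighborhood of the origin,
\[
  V(0) = 0, \qquad V(x) > 0 \ \ (x \neq 0), \qquad
  \nabla V(x)^{\top} f(x) < 0 \ \ (x \neq 0).
\]
% The cited example exhibits a globally asymptotically stable vector field for
% which no analytic V with these properties exists, even locally.
```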

The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables

The mesh adaptive direct search (Mads) algorithm is designed for blackbox optimization problems, in which the objective and constraint functions are typically the outputs of a simulation treated as a blackbox. It is a derivative-free optimization method designed for continuous variables and is supported by a convergence analysis based on the Clarke …
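
A schematic Python sketch of a single mesh-based poll step with granular variables, under our own simplifications (it is not the Mads/NOMAD implementation):

```python
import numpy as np

def poll_step(f, x, mesh_size, granularity):
    """One schematic poll step of a mesh-based direct search: evaluate the 2n
    coordinate directions scaled by the mesh size, snapping trial points to each
    variable's granularity, and move to the best improving point if any."""
    best_x, best_f = x, f(x)
    for i in range(len(x)):
        for sign in (+1.0, -1.0):
            t = x.copy()
            t[i] += sign * mesh_size
            if granularity[i] > 0:
                # respect granular variables: round to the nearest allowed multiple
                t[i] = granularity[i] * round(t[i] / granularity[i])
            ft = f(t)
            if ft < best_f:
                best_x, best_f = t, ft
    return best_x, best_f
```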