Nonlinear Derivative-free Constrained Optimization with a Mixed Penalty-Logarithmic Barrier Approach and Direct Search

In this work, we propose the joint use of a mixed penalty-logarithmic barrier approach and generating set search for addressing nonlinearly constrained derivative-free optimization problems. A merit function is considered, wherein the set of inequality constraints is divided into two groups: one treated with a logarithmic barrier approach, and the other, along with the equality constraints, … Read more
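To fix ideas, a mixed penalty-barrier merit function typically takes the following shape; the exact form used in the paper lies beyond the truncation, so the grouping $\mathcal{I}_B$ (barrier), $\mathcal{I}_P$ (penalty), barrier parameter $\mu$, and penalty parameter $\rho$ are notation assumed here for illustration:

$$
M(x;\mu,\rho) \;=\; f(x) \;-\; \mu \sum_{i \in \mathcal{I}_B} \log\bigl(-g_i(x)\bigr) \;+\; \rho \Bigl( \sum_{i \in \mathcal{I}_P} \max\{0,\, g_i(x)\}^2 \;+\; \sum_{j \in \mathcal{E}} h_j(x)^2 \Bigr),
$$

with $\mu$ driven to zero and $\rho$ driven to infinity as the generating set search proceeds.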

Local Convergence Analysis for Nonisolated Solutions to Derivative-Free Methods of Optimization

This paper provides a local convergence analysis for newly developed derivative-free methods in problems of smooth nonconvex optimization. We focus here on local convergence to local minimizers, which might be nonisolated and hence more challenging for convergence analysis. The main results provide efficient conditions for local convergence to arbitrary local minimizers under the fulfillment of … Read more

An exact method for a class of robust nonlinear optimization problems

We introduce a novel exact approach for addressing a broad spectrum of optimization problems with robust nonlinear constraints. These constraints are defined as sums of products of linear and concave (SLC) functions of the uncertain parameters. Our approach combines a cutting set method with reformulation-perspectification techniques and branch and bound. We further extend … Read more
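Concretely, one plausible reading of a single robust SLC constraint is the sketch below, in which the uncertainty set $\mathcal{Z}$ and the names $\ell_k$, $c_k$ are assumptions for illustration:

$$
\max_{\zeta \in \mathcal{Z}} \; \sum_{k=1}^{K} \ell_k(x, \zeta)\, c_k(\zeta) \;\le\; 0, \qquad \ell_k(x, \cdot) \ \text{affine in}\ \zeta, \quad c_k \ \text{concave in}\ \zeta.
$$

A cutting set method then alternates between solving the outer problem over $x$ against a finite scenario set $\hat{\mathcal{Z}} \subset \mathcal{Z}$ and solving the inner maximization to find a violated scenario $\zeta$ to add.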

Block cubic Newton with greedy selection

A second-order block coordinate descent method is proposed for the unconstrained minimization of an objective function with a Lipschitz continuous Hessian. At each iteration, a block of variables is selected by a greedy (Gauss-Southwell) rule based on the amount of first-order stationarity violation, and an approximate minimizer of a cubic model is then computed … Read more
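A minimal sketch of this scheme on a toy quadratic is given below; the inner solver for the cubic subproblem and every parameter choice are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def greedy_block(grad, blocks):
    # Gauss-Southwell rule: pick the block whose gradient norm
    # (first-order stationarity violation) is largest.
    return max(blocks, key=lambda b: np.linalg.norm(grad[b]))

def cubic_step(g, H, L):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 s^T H s + (L/6) ||s||^3
    # by gradient descent on m (a simple stand-in subproblem solver).
    s = np.zeros_like(g)
    lr = 0.5 / (np.linalg.norm(H, 2) + L + 1.0)
    for _ in range(300):
        s -= lr * (g + H @ s + 0.5 * L * np.linalg.norm(s) * s)
    return s

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A.T @ A + np.eye(4)                 # positive definite test Hessian
b = rng.standard_normal(4)
x = np.zeros(4)                         # minimize 0.5 x^T A x - b^T x
blocks = [np.array([0, 1]), np.array([2, 3])]
L = 1.0                                 # assumed Hessian Lipschitz constant

for _ in range(30):
    grad = A @ x - b
    blk = greedy_block(grad, blocks)
    x[blk] += cubic_step(grad[blk], A[np.ix_(blk, blk)], L)

print("stationarity violation:", np.linalg.norm(A @ x - b))
```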

An Adaptive Proximal ADMM for Nonconvex Linearly Constrained Composite Programs

This paper develops an adaptive proximal alternating direction method of multipliers (ADMM) for solving linearly constrained, composite optimization problems under the assumption that the smooth component of the objective is weakly convex, while the non-smooth component is convex and block-separable. The proposed method is adaptive to all problem parameters, including smoothness and weak convexity constants, … Read more
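For orientation, one common proximal ADMM template for $\min_x f(x) + \sum_{i=1}^p h_i(x_i)$ subject to $\sum_{i=1}^p A_i x_i = b$ is sketched below; the adaptive choice of the penalty $\beta$ and proximal stepsizes $\eta_i$ is the paper's contribution and is not reproduced here:

$$
\begin{aligned}
x_i^{k+1} &\in \arg\min_{x_i}\; h_i(x_i) + \bigl\langle \nabla_i f(x^k) + A_i^\top \lambda^k,\, x_i \bigr\rangle + \frac{\beta}{2}\Bigl\| A_i x_i + \sum_{j \neq i} A_j \tilde{x}_j - b \Bigr\|^2 + \frac{1}{2\eta_i}\|x_i - x_i^k\|^2, \\
\lambda^{k+1} &= \lambda^k + \beta \Bigl( \sum_{i=1}^p A_i x_i^{k+1} - b \Bigr),
\end{aligned}
$$

where $\tilde{x}_j$ denotes the most recently updated value of block $j$ within the sweep.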

Black-box Optimization Algorithms for Regularized Least-squares Problems

We consider the problem of optimizing the sum of a smooth, nonconvex function for which derivatives are unavailable, and a convex, nonsmooth function with an easy-to-evaluate proximal operator. Of particular focus is the case where the smooth part has a nonlinear least-squares structure. We adapt two existing approaches for derivative-free optimization of nonsmooth compositions of smooth … Read more
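The setting can be made concrete with a derivative-free proximal-gradient loop like the sketch below, where the forward-difference gradient estimate and fixed stepsize are illustrative assumptions (the paper's methods are model-based rather than finite-difference):

```python
import numpy as np

def fd_grad(f, x, h=1e-6):
    # Forward-difference estimate of the gradient of the black-box f.
    fx, g = f(x), np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def prox_l1(v, t):
    # Soft-thresholding: proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Toy nonlinear least squares f(x) = 0.5*||A tanh(x) - y||^2, derivatives
# treated as unavailable, regularized by lam*||x||_1.
rng = np.random.default_rng(1)
A, y = rng.standard_normal((8, 5)), rng.standard_normal(8)
f = lambda x: 0.5 * np.sum((A @ np.tanh(x) - y) ** 2)

x, lam, step = np.ones(5), 0.1, 0.01
for _ in range(500):
    x = prox_l1(x - step * fd_grad(f, x), step * lam)

print("objective:", f(x) + lam * np.abs(x).sum())
```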

On the strength of Burer’s lifted convex relaxation to quadratic programming with ball constraints

We study quadratic programs with m ball constraints and the strength of a lifted convex relaxation for this class recently proposed by Burer (2024). Burer shows that this relaxation is exact when m=2. For general m, Burer (2024) provides numerical evidence that this lifted relaxation is tighter than the Kronecker product based Reformulation Linearization Technique (RLT) inequalities … Read more
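As background, the base (Shor-type) relaxation of a QP with $m$ ball constraints replaces the rank-one matrix $xx^\top$ with a matrix variable, as sketched below; Burer's relaxation works in a further lifted space and is stronger, which this sketch does not reproduce:

$$
\min_{x \in \mathbb{R}^n,\, X \in \mathbb{S}^n} \;\; \langle Q, X \rangle + 2 c^\top x
\quad \text{s.t.} \quad
\operatorname{tr}(X) - 2 a_i^\top x + \|a_i\|^2 \le r_i^2 \;\; (i = 1, \dots, m), \qquad X \succeq x x^\top,
$$

where the original nonconvex problem corresponds to the additional requirement $X = x x^\top$.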

Regularized Gradient Clipping Provably Trains Wide and Deep Neural Networks

In this work, we instantiate a regularized form of the gradient clipping algorithm and prove that it can converge to the global minima of deep neural network loss functions provided that the network is sufficiently wide. We present empirical evidence that our theoretically grounded regularized gradient clipping algorithm is also competitive with the state-of-the-art … Read more
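Standard gradient clipping scales the step by $\min(1, \gamma/\|g\|)$; the sketch below adds a floor $\delta$ that keeps the effective stepsize bounded away from zero, one simple way to regularize clipping, assumed here for illustration rather than taken verbatim from the paper:

```python
import numpy as np

def reg_clip_step(x, g, eta=0.1, gamma=1.0, delta=1e-3):
    # Clipped stepsize with a regularizing floor: the scale never drops
    # below delta, so progress cannot stall on huge gradients.
    scale = max(min(1.0, gamma / (np.linalg.norm(g) + 1e-12)), delta)
    return x - eta * scale * g

# Toy run on f(x) = 0.25*||x||^4, whose gradient explodes far from 0.
x = np.full(3, 20.0)
for _ in range(250):
    g = np.linalg.norm(x) ** 2 * x      # gradient of 0.25*||x||^4
    x = reg_clip_step(x, g)

print("final ||x||:", np.linalg.norm(x))
```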

Predictive Low Rank Matrix Learning under Partial Observations: Mixed-Projection ADMM

We study the problem of learning a partially observed matrix under a low-rank assumption, in the presence of fully observed side information that depends linearly on the true underlying matrix. This problem is an important generalization of the Matrix Completion problem, a central problem in Statistics, Operations Research and Machine Learning, which arises … Read more
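Schematically, with observed entries $A_{ij}$ for $(i,j) \in \Omega$, side information $B = M W$ for the true matrix $M$ and a known linear map $W$, and a rank budget $k$, a formulation in this family reads as follows (all notation assumed here for illustration):

$$
\min_{X \in \mathbb{R}^{n \times m}} \;\; \sum_{(i,j) \in \Omega} (X_{ij} - A_{ij})^2 \;+\; \lambda\, \| X W - B \|_F^2
\quad \text{s.t.} \quad \operatorname{rank}(X) \le k.
$$

In the mixed-projection framework, the rank constraint is typically modeled through a projection matrix $P$ (with $P^\top = P$, $P^2 = P$, $\operatorname{tr}(P) \le k$) and the coupling $X = P X$, which is the splitting the ADMM then exploits.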