The Adaptive Sampling Gradient Method: Optimizing Smooth Functions with an Inexact Oracle

Consider settings such as stochastic optimization where a smooth objective function $f$ is unknown but can be estimated with an \emph{inexact oracle} such as quasi-Monte Carlo (QMC) or numerical quadrature. The inexact oracle is assumed to yield function estimates having error that decays with increasing oracle effort. For solving such problems, we present the Adaptive … Read more
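The adaptive-sampling idea behind such methods can be pictured with a minimal Monte Carlo sketch (our illustration, not the ASGM of the paper, whose oracle could equally be QMC or quadrature): run gradient descent, and grow the per-iteration sample size whenever the estimated sampling error is large relative to the estimated gradient norm, so that oracle effort, and hence accuracy, increases as a stationary point is approached. The norm test, the constant theta, and the toy objective below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([1.0, -2.0])            # unknown mean; also the minimizer of f below

    def sample_gradients(x, n):
        """n stochastic gradients of f(x) = E[0.5*||x - xi||^2], xi ~ N(mu, I)."""
        xi = rng.normal(loc=mu, scale=1.0, size=(n, x.size))
        return x - xi                     # each row is one unbiased gradient sample

    def adaptive_sampling_gd(x0, alpha=0.5, theta=0.5, n0=8, iters=30):
        x, n = np.asarray(x0, dtype=float), n0
        for _ in range(iters):
            while True:
                G = sample_gradients(x, n)
                ghat = G.mean(axis=0)
                stderr = np.linalg.norm(G.std(axis=0, ddof=1)) / np.sqrt(n)
                # "norm test": raise the oracle effort until the sampling error
                # is small relative to the estimated gradient norm
                if stderr <= theta * np.linalg.norm(ghat) or n >= 10**5:
                    break
                n *= 2
            x = x - alpha * ghat
        return x, n

    x_hat, n_final = adaptive_sampling_gd([10.0, 10.0])
    print(x_hat, n_final)                 # x_hat should be close to mu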

Two New Weak Constraint Qualifications for Mathematical Programs with Equilibrium Constraints and Applications

We introduce two new weak Constraint Qualifications (CQs) for Mathematical Programs with Equilibrium (or Complementarity) Constraints, MPEC for short. One of them is a tailored version of the Constant Rank of the Subspace Component (CRSC) condition and the other is a relaxed version of the MPEC-No Nonzero Abnormal Multiplier Constraint Qualification (MPEC-NNAMCQ). Both incorporate the exact set … Read more
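For readers less familiar with the problem class, an MPEC in the form typically used when stating such constraint qualifications reads (our notation, not necessarily the authors')
\[
\min_{x}\ f(x) \quad \text{s.t.}\quad g(x) \le 0,\; h(x) = 0,\; G(x) \ge 0,\; H(x) \ge 0,\; G_i(x)\,H_i(x) = 0,\ i = 1,\dots,m,
\]
where the last three blocks encode the complementarity condition $0 \le G(x) \perp H(x) \ge 0$; written this way, standard CQs such as MFCQ fail at every feasible point, which is why tailored, weaker MPEC constraint qualifications are of interest.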

Globally Solving a Class of Optimal Power Flow Problems in Radial Networks by Tree Reduction

We devise an algorithm for finding the globally optimal solution of the so-called optimal power flow problem (OPF) for a class of power networks with a tree topology, also called radial networks, for which an efficient and reliable algorithm was not previously known. The algorithm we present is called the tree reduction/expansion method, and is … Read more

A simplicial decomposition framework for large scale convex quadratic programming

In this paper, we analyze in depth a simplicial decomposition-like algorithmic framework for large scale convex quadratic programming. In particular, we first propose two tailored strategies for handling the master problem. Then, we describe a few techniques for speeding up the solution of the pricing problem. We report extensive numerical experiments on both real … Read more
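For orientation, the generic simplicial decomposition template underlying such a framework (stated in our notation) alternates, for a convex quadratic $f(x) = \tfrac12 x^\top Q x + c^\top x$ over a polyhedron $X$, between a master problem posed on the convex hull of the extreme points $\hat x_1, \dots, \hat x_k$ generated so far and a pricing problem that generates the next one:
\[
\text{master:}\ \min_{\lambda \ge 0,\ \sum_i \lambda_i = 1} f\Big(\sum_{i=1}^{k} \lambda_i \hat x_i\Big),
\qquad
\text{pricing:}\ \min_{x \in X}\ \nabla f(x_k)^\top x,
\]
where $x_k$ is the current master solution; if the pricing solution $\hat x_{k+1}$ satisfies $\nabla f(x_k)^\top (\hat x_{k+1} - x_k) \ge 0$, then $x_k$ is optimal over $X$, and otherwise $\hat x_{k+1}$ is added as a new column.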

Vector Transport-Free SVRG with General Retraction for Riemannian Optimization: Complexity Analysis and Practical Implementation

In this paper, we propose a vector transport-free stochastic variance reduced gradient (SVRG) method with general retraction for empirical risk minimization over Riemannian manifolds. Existing SVRG methods on manifolds usually consider a specific retraction operation and involve additional computational costs such as parallel transport or vector transport. The vector transport-free SVRG with general retraction we … Read more
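The vector-transport-free mechanism can be sketched on a toy Rayleigh-quotient problem on the unit sphere (a minimal sketch of the general idea, not the algorithm or analysis of the paper): the SVRG correction term is assembled from Euclidean gradients in the ambient space and only then projected onto the current tangent space, so no parallel or vector transport is performed, and the retraction is plain renormalization. The loss, step size, and epoch counts are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    N, d = 200, 10
    A = rng.normal(size=(N, d))                       # data rows a_i (illustrative)

    def egrad(x, i):                                  # Euclidean gradient of
        a = A[i]                                      # f_i(x) = -(a_i^T x)^2
        return -2.0 * (a @ x) * a

    def full_egrad(x):                                # gradient of the average loss
        return (-2.0 / N) * (A.T @ (A @ x))

    proj = lambda x, v: v - (x @ v) * x               # projection onto the tangent space at x
    retract = lambda x, eta: (x + eta) / np.linalg.norm(x + eta)

    def vt_free_svrg(step=0.01, epochs=20):
        x = retract(np.zeros(d), rng.normal(size=d))  # random point on the sphere
        for _ in range(epochs):
            x_snap, g_snap = x.copy(), full_egrad(x)  # snapshot and its full gradient
            for _ in range(N):
                i = rng.integers(N)
                # variance-reduced direction formed in the ambient space and only
                # then projected: no vector/parallel transport of tangent vectors
                v = egrad(x, i) - egrad(x_snap, i) + g_snap
                x = retract(x, -step * proj(x, v))
        return x

    x = vt_free_svrg()
    lead = np.linalg.eigh(A.T @ A / N)[1][:, -1]      # leading eigenvector
    print(abs(x @ lead))                              # should approach 1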

A New Use of Douglas-Rachford Splitting and ADMM for Identifying Infeasible, Unbounded, and Pathological Conic Programs

In this paper, we present a method for identifying infeasible, unbounded, and pathological conic programs based on Douglas-Rachford splitting, or equivalently ADMM. When an optimization program is infeasible, unbounded, or pathological, the iterates of Douglas-Rachford splitting diverge. Somewhat surprisingly, such divergent iterates still provide useful information, which our method uses for identification. In addition, for strongly … Read more
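The behaviour the method exploits is easy to see on a toy infeasible problem (our example, not one from the paper): for two parallel, non-intersecting lines the Douglas-Rachford iterates diverge, yet the successive differences $z^{k+1} - z^k$ settle at the constant displacement vector separating the two sets, which is exactly the kind of limiting information that can serve as an infeasibility certificate.

    import numpy as np

    # Two disjoint closed convex sets in R^2: C = {y = 0} and D = {y = 1}.
    proj_C = lambda z: np.array([z[0], 0.0])
    proj_D = lambda z: np.array([z[0], 1.0])

    def dr_step(z):
        """One Douglas-Rachford iteration: z <- z + P_D(2 P_C(z) - z) - P_C(z)."""
        x = proj_C(z)
        return z + proj_D(2.0 * x - z) - x

    z = np.array([3.0, -2.0])
    for k in range(50):
        z_prev, z = z, dr_step(z)

    print(z)           # the iterates diverge (the second coordinate grows without bound)
    print(z - z_prev)  # the differences converge to the gap vector (0, 1) between C and D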

Oracle Complexity of Second-Order Methods for Smooth Convex Optimization

Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive … Read more
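In this setting a second-order oracle returns $\big(f(x), \nabla f(x), \nabla^2 f(x)\big)$ at any query point $x$, and the oracle complexity of a method on a class $\mathcal{F}$ of smooth convex functions is the worst-case number of queries needed to reach accuracy $\epsilon$ (a standard formulation, in our notation):
\[
N(\epsilon) \;=\; \min\Big\{ T \;:\; \sup_{f \in \mathcal{F}} \big( f(x_T) - \min_x f(x) \big) \le \epsilon \Big\},
\]
where $x_T$ denotes the point the method returns after $T$ oracle calls.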

An Inexact Newton-like conditional gradient method for constrained nonlinear systems

In this paper, we propose an inexact Newton-like conditional gradient method for solving constrained systems of nonlinear equations. The local convergence of the new method as well as results on its rate are established by using a general majorant condition. Two applications of such a condition are provided: one is for functions whose derivative satisfies … Read more
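One way to picture a Newton-like conditional gradient step (a generic sketch under our own assumptions, not the method or the majorant condition of the paper): at the current iterate, the constrained linearized system is solved only approximately by a few Frank-Wolfe iterations on the squared linearized residual over the feasible set, here taken to be a box so that the linear minimization oracle is trivial.

    import numpy as np

    # Constrained nonlinear system: find x in the box C = [0, 2]^2 with F(x) = 0.
    lo, hi = np.zeros(2), 2.0 * np.ones(2)

    def F(x):
        return np.array([x[0]**2 + x[1] - 1.0, x[0] + x[1]**2 - 1.0])

    def J(x):                                    # Jacobian of F
        return np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])

    def newton_cg_step(x, fw_iters=20):
        """Approximately solve  min_{z in C} 0.5 * ||F(x) + J(x)(z - x)||^2
        with a few Frank-Wolfe (conditional gradient) iterations."""
        Fx, Jx = F(x), J(x)
        z = x.copy()
        for _ in range(fw_iters):
            r = Fx + Jx @ (z - x)                # linearized residual at z
            g = Jx.T @ r                         # gradient of the quadratic model
            s = np.where(g > 0, lo, hi)          # linear minimization oracle on the box
            d = s - z
            denom = np.dot(Jx @ d, Jx @ d)
            if denom <= 1e-16:
                break
            t = np.clip(-np.dot(g, d) / denom, 0.0, 1.0)   # exact line search
            z = z + t * d
        return z

    x = np.array([2.0, 2.0])
    for _ in range(20):                          # outer Newton-like iterations
        x = newton_cg_step(x)
    print(x, F(x))   # converges to x0 = x1 = (sqrt(5) - 1) / 2; residual near zero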

A pattern search and implicit filtering algorithm for solving linearly constrained minimization problems with noisy objective functions

PSIFA (Pattern Search and Implicit Filtering Algorithm) is a derivative-free algorithm designed for linearly constrained problems with noise in the objective function. It combines elements of the pattern search approach of Lewis and Torczon (2000) with ideas from the implicit filtering method of Kelley (2011), enhanced with a further analysis … Read more
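The following is not PSIFA itself, just a bare-bones illustration of one ingredient it builds on: poll a stencil of directions around the current point and shrink the stencil when no direction improves the noisy objective, so that the step size acts as a filter for the noise. Bound constraints stand in for the general linear constraints handled by the actual algorithm, and the noisy quadratic and all tolerances are assumptions made for the sketch.

    import numpy as np

    rng = np.random.default_rng(2)
    lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

    def f_noisy(x):
        """Smooth quadratic plus small additive noise (illustrative)."""
        return np.sum((x - 1.0) ** 2) + 1e-3 * rng.normal()

    def pattern_search(x0, h=1.0, h_min=1e-4, max_iter=500):
        x = np.clip(np.asarray(x0, dtype=float), lo, hi)
        fx = f_noisy(x)
        D = np.vstack([np.eye(2), -np.eye(2)])   # coordinate poll directions
        for _ in range(max_iter):
            improved = False
            for d in D:                          # poll a stencil of radius h around x
                y = np.clip(x + h * d, lo, hi)   # keep the trial point feasible
                fy = f_noisy(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if not improved:                     # unsuccessful poll: shrink the stencil,
                h *= 0.5                         # which filters out the noise
                if h < h_min:
                    break
        return x

    print(pattern_search([-4.0, 4.5]))           # should end up near (1, 1)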

The New Butterfly Relaxation Methods for Mathematical Program with Complementarity Constraints

We propose a new family of relaxation schemes for mathematical programs with complementarity constraints that extends the relaxations of Kadrani, Dussault, and Benchakroun from 2009 and that of Kanzow \& Schwartz from 2011. We discuss the properties of the sequence of relaxed non-linear programs as well as stationarity properties of limiting points. A sub-family of … Read more
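To fix ideas (a generic template in our notation, not the butterfly scheme itself): writing the complementarity constraints as $G(x) \ge 0$, $H(x) \ge 0$, $G_i(x) H_i(x) = 0$, a relaxation scheme replaces them, for a parameter $t > 0$, by a relaxed system such as
\[
G_i(x) \ge -t, \qquad H_i(x) \ge -t, \qquad \Phi_t\big(G_i(x), H_i(x)\big) \le 0, \qquad i = 1, \dots, m,
\]
where $\Phi_t$ is chosen so that the feasible sets of the relaxed non-linear programs shrink back to the feasible set of the original problem as $t \downarrow 0$; different published schemes correspond to different choices of this relaxation map and its parameters.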