Optimization problems with value function objectives

The family of optimization problems with value function objectives includes the minmax programming problem and the bilevel optimization problem. In this paper, we derive necessary optimality conditions for this class of problems. The main focus is on the case where the functions involved are nonsmooth and the constraints are given by very general operator constraints. … Read more
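In symbols (our own generic notation, not necessarily the paper's), such a problem minimizes an objective that is itself the optimal value of a parametric lower-level problem:

\[
  \min_{x \in X} \ \varphi(x), \qquad \varphi(x) \;:=\; \min_{y \in K(x)} f(x,y),
\]

where the minmax case is obtained by replacing the inner \min with a \max, and the (optimistic) bilevel case by taking K(x) to be the solution set of a lower-level problem parametrized by x.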

Regularized Sequential Quadratic Programming

We present the formulation and analysis of a new sequential quadratic programming (SQP) method for general nonlinearly constrained optimization. The method pairs a primal-dual generalized augmented Lagrangian merit function with a flexible line search to obtain a sequence of improving estimates of the solution. This function is a primal-dual variant of the augmented Lagrangian proposed … Read more
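For orientation (standard background, not a quotation from the paper): with equality constraints c(x) = 0, the classical augmented Lagrangian merit function reads

\[
  \mathcal{L}_A(x,\lambda;\mu) \;=\; f(x) \;-\; \lambda^{T} c(x) \;+\; \frac{1}{2\mu}\,\| c(x) \|_2^2 ,
\]

and a primal-dual variant, roughly speaking, treats the multiplier estimate \lambda as a variable alongside x rather than as a fixed parameter.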

A Simple Approach to Optimality Conditions in Minmax Programming

Considering the minmax programming problem, lower and upper subdifferential optimality conditions, in the sense of Mordukhovich, are derived. The approach here, mainly based on the nonsmooth dual objects of Mordukhovich, is completely different from that of most of the previous works, where generalizations of the alternative theorem of Farkas have been applied. The results obtained … Read more
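For readers less familiar with the setting, a minmax programming problem can be written (in generic notation) as

\[
  \min_{x \in C} \ \varphi(x), \qquad \varphi(x) \;=\; \max_{1 \le i \le m} f_i(x),
\]

and the optimality conditions in question are Fermat-type inclusions, typically of the form 0 \in \partial\varphi(\bar{x}) + N(\bar{x};C), stated with Mordukhovich's nonsmooth subdifferential and normal cone constructions rather than the convex subdifferential.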

Higher-Order Confidence Intervals for Stochastic Programming using Bootstrapping

We study the problem of constructing confidence intervals for the optimal value of a stochastic programming problem by using bootstrapping. Bootstrapping is a resampling method used in the statistical inference of unknown parameters for which only a small number of samples can be obtained. One such parameter is the optimal value of a stochastic optimization … Read more
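As a rough illustration of the basic idea (a plain percentile bootstrap on a toy stochastic program; the paper's higher-order intervals refine this construction), one could do:

```python
import numpy as np

def saa_optimal_value(samples):
    # Sample-average approximation of min_x E[(x - xi)^2]:
    # the SAA optimizer is the sample mean, and the SAA optimal
    # value is the sample variance (a toy stand-in for a harder
    # stochastic program).
    x_star = samples.mean()
    return np.mean((x_star - samples) ** 2)

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=30)   # deliberately small sample

B = 2000
boot_vals = np.empty(B)
for b in range(B):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_vals[b] = saa_optimal_value(resample)

lo, hi = np.percentile(boot_vals, [2.5, 97.5])
print(f"95% percentile-bootstrap CI for the optimal value: [{lo:.3f}, {hi:.3f}]")
```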

Approximate spectral factorization for design of efficient sub-filter sequences

A well-known approach to the design of computationally efficient filters is to use spectral factorization, i.e. a decomposition of a filter into a sequence of sub-filters. Due to the sparsity of the sub-filters, the typical processing speedup is a factor of 1-10 in 2D and 10-100 in 3D. The design of such … Read more
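A minimal 1-D illustration of the underlying identity (illustrative kernels, not from the paper): filtering with a cascade of short sub-filters gives the same output as filtering with their composite kernel, and becomes cheaper whenever the sub-filters are short or sparse relative to the composite filter.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(1000)

h1 = np.array([1.0, 2.0, 1.0]) / 4.0    # short smoothing sub-filter
h2 = np.array([1.0, 0.0, -1.0]) / 2.0   # short derivative-like sub-filter

composite = np.convolve(h1, h2)          # the equivalent single filter

cascade_out = np.convolve(np.convolve(signal, h1), h2)
direct_out  = np.convolve(signal, composite)

# Convolution is associative, so the two outputs coincide.
print(np.allclose(cascade_out, direct_out))   # True

# Cost per output sample scales with the number of nonzero taps applied;
# with sparse sub-filters in 2-D and 3-D the cascade needs far fewer
# multiplications than the composite filter, which is where the speedups
# cited above come from.
```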

Derivative-free methods for constrained mixed-integer optimization

We consider the problem of minimizing a continuously differentiable function of several variables subject to simple bound and general nonlinear inequality constraints, where some of the variables are restricted to take integer values. We assume that the first-order derivatives of the objective and constraint functions can be neither calculated nor approximated explicitly. This class … Read more
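In symbols (our notation), the problem class is

\[
  \min_{x} \ f(x) \quad \text{s.t.} \quad g(x) \le 0, \quad \ell \le x \le u, \quad x_i \in \mathbb{Z} \ \ (i \in I),
\]

where f and g are continuously differentiable but only their function values, not their derivatives, are available to the algorithm.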

Decomposition methods based on projected gradient for network equilibrium problems

In this work we consider the symmetric network equilibrium problem formulated as a convex minimization problem whose variables are the path flows. In order to cope with the large dimension of real network problems, we adopt a column generation strategy and employ a gradient projection method within an inexact decomposition … Read more
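A sketch of the building block involved (our own toy code, not the paper's algorithm): each origin-destination pair's path flows live on a scaled simplex, so an inner iteration is a gradient step on those flows followed by a Euclidean projection back onto the simplex.

```python
import numpy as np

def project_to_simplex(v, demand=1.0):
    """Euclidean projection of v onto {x >= 0, sum(x) = demand}.
    Standard sort-based routine; the path flows of one OD pair must
    carry that pair's travel demand."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - demand
    rho = np.nonzero(u - css / np.arange(1, v.size + 1) > 0)[0][-1]
    theta = css[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def projected_gradient_step(flows, grad, step, demand):
    # One inner step: each OD pair's path flows are updated independently.
    return project_to_simplex(flows - step * grad, demand)

# Toy usage: 3 paths for one OD pair with demand 10 (numbers made up).
flows = np.array([4.0, 3.0, 3.0])
grad  = np.array([1.2, 0.7, 0.9])   # gradient of path travel costs
print(projected_gradient_step(flows, grad, step=1.0, demand=10.0))
```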

An FPTAS for Optimizing a Class of Low-Rank Functions Over a Polytope

We present a fully polynomial time approximation scheme (FPTAS) for optimizing a very general class of nonlinear functions of low rank over a polytope. Our approximation scheme relies on constructing an approximate Pareto-optimal front of the linear functions which constitute the given low-rank function. In contrast to existing results in the literature, our approximation scheme … Read more
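Here a function of low rank k over a polytope P is, roughly and in our notation, one of the form

\[
  \min_{x \in P} \ f(x), \qquad f(x) \;=\; g\bigl(a_1^{T} x,\,\dots,\,a_k^{T} x\bigr),
\]

so f depends on x only through k linear forms; the scheme builds an approximate Pareto-optimal front of the values (a_1^{T} x, \dots, a_k^{T} x) over P and then optimizes g over that approximate front.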

Adjoint Sensitivity Analysis for Numerical Weather Prediction: Applications to Power Grid Optimization

We present an approach to estimate adjoint sensitivities of economic metrics of relevance in the power grid with respect to physical weather variables using numerical weather prediction models. We demonstrate that this capability can significantly enhance planning and operations. We illustrate the method using a large-scale computational study where we compute sensitivities of the regional … Read more
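The mechanism can be sketched on a toy discrete linear model (illustrative code, unrelated to any NWP system): a single backward adjoint sweep yields the sensitivity of one scalar metric with respect to all inputs at once, which is the property adjoint methods exploit.

```python
import numpy as np

# Toy model x_{k+1} = A x_k + B p with scalar metric J = c^T x_N.
rng = np.random.default_rng(0)
n, m, N = 4, 3, 20                      # states, parameters, steps (illustrative)
A = 0.9 * np.eye(n) + 0.01 * rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
c = rng.standard_normal(n)
x0, p = rng.standard_normal(n), rng.standard_normal(m)

# Forward sweep.
x = x0.copy()
for _ in range(N):
    x = A @ x + B @ p
J = c @ x

# Adjoint (backward) sweep: one pass gives dJ/dp for all parameters.
lam = c.copy()
dJdp = np.zeros(m)
for _ in range(N):
    dJdp += B.T @ lam
    lam = A.T @ lam

# Finite-difference check on the first parameter.
eps = 1e-6
p_eps = p.copy(); p_eps[0] += eps
x = x0.copy()
for _ in range(N):
    x = A @ x + B @ p_eps
print(dJdp[0], (c @ x - J) / eps)       # the two numbers should agree
```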

Constrained Derivative-Free Optimization on Thin Domains

Many derivative-free methods for constrained problems are not efficient for minimizing functions on “thin” domains. Other algorithms, like those based on Augmented Lagrangians, deal with thin constraints using penalty-like strategies. When the constraints are computationally inexpensive but highly nonlinear, these methods spend many potentially expensive objective function evaluations motivated by the difficulties of improving feasibility. … Read more