Steering Exact Penalty Methods for Optimization

This paper reviews, extends and analyzes a new class of penalty methods for nonlinear optimization. These methods adjust the penalty parameter dynamically; by controlling the degree of linear feasibility achieved at every iteration, they promote balanced progress toward optimality and feasibility. In contrast with classical approaches, the choice of the penalty parameter ceases to be …
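A minimal sketch of the dynamic penalty-parameter idea on a toy equality-constrained problem, using an ℓ1 exact penalty and a simple "increase ρ when the iterate stays infeasible" update; the toy problem, the update rule, and the use of a derivative-free inner solver are illustrative assumptions, not the steering algorithm analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: minimize f(x) subject to the equality constraint c(x) = 0.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
c = lambda x: np.array([x[0] + x[1]])                # single equality constraint

def exact_penalty(x, rho):
    # Nonsmooth l1 exact penalty: phi(x; rho) = f(x) + rho * ||c(x)||_1.
    return f(x) + rho * np.sum(np.abs(c(x)))

x, rho = np.zeros(2), 1.0
for _ in range(20):
    # Minimize the penalty function for the current parameter (derivative-free,
    # since phi is nonsmooth); this plays the role of one "outer" iteration.
    x = minimize(exact_penalty, x, args=(rho,), method="Nelder-Mead").x
    if np.sum(np.abs(c(x))) < 1e-8:
        break
    # Illustrative update: if the iterate is not (nearly) feasible,
    # increase the penalty parameter and re-solve.
    rho *= 10.0

print(x, rho)
```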

Algorithm xxx: APPSPACK 4.0: Asynchronous Parallel Pattern Search for Derivative-Free Optimization

APPSPACK is software for solving unconstrained and bound-constrained optimization problems. It implements an asynchronous parallel pattern search method that has been specifically designed for problems characterized by expensive function evaluations. Using APPSPACK to solve optimization problems has several advantages: no derivative information is needed; the procedure for evaluating the objective function can be executed …
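APPSPACK itself is asynchronous, parallel C++ software; the following is only a serial Python sketch of the generic pattern-search idea it builds on (poll a fixed set of directions, contract the step after an unsuccessful poll), with an invented test function and parameters, not APPSPACK's interface.

```python
import numpy as np

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal coordinate pattern search: poll +/- each coordinate direction,
    move to the first improving point, otherwise shrink the step."""
    x, fx = np.asarray(x0, dtype=float), f(x0)
    directions = np.vstack([np.eye(len(x)), -np.eye(len(x))])
    for _ in range(max_iter):
        improved = False
        for d in directions:
            trial = x + step * d
            ft = f(trial)
            if ft < fx:                 # accept the first improving poll point
                x, fx, improved = trial, ft, True
                break
        if not improved:
            step *= 0.5                 # unsuccessful poll: contract the mesh
            if step < tol:
                break
    return x, fx

# Example: a smooth test function evaluated without any derivative information.
xmin, fmin = pattern_search(lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2, [0.0, 0.0])
print(xmin, fmin)
```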

Sums of Random Symmetric Matrices and Applications

Let B_i be deterministic symmetric m×m matrices, and let ξ_i be independent random scalars with zero mean and “of order of one” (e.g., ξ_i Gaussian with zero mean and unit standard deviation). We are interested in conditions for the “typical norm” of the random matrix S_N = ξ_1 B_1 + … + ξ_N B_N to be of order of 1. …
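A small Monte Carlo sketch of the object in question: with the B_i normalized to unit spectral norm and ξ_i standard Gaussian, it estimates the typical spectral norm of S_N = ξ_1 B_1 + … + ξ_N B_N. The particular choice of B_i, the normalization, and the sample sizes are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, N = 50, 200

# Deterministic symmetric matrices B_i (here: random, but fixed once drawn),
# normalized so that ||B_i|| = 1 in spectral norm.
B = [(A + A.T) / 2 for A in rng.standard_normal((N, m, m))]
B = [Bi / np.linalg.norm(Bi, 2) for Bi in B]

def sample_norm():
    # Draw independent standard Gaussian xi_i and form S_N = sum_i xi_i * B_i.
    xi = rng.standard_normal(N)
    S = sum(x * Bi for x, Bi in zip(xi, B))
    return np.linalg.norm(S, 2)          # spectral norm of one realization of S_N

norms = [sample_norm() for _ in range(50)]
print("typical norm estimate:", np.mean(norms), "+/-", np.std(norms))
```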

Continuous optimization of beamlet intensities for photon and proton radiotherapy

Inverse approaches and, in particular, intensity-modulated radiotherapy (IMRT), in combination with the development of new technologies such as multi-leaf collimators (MLCs), have opened new possibilities for radiotherapy in cancer treatment. The main mathematical tool needed in this connection is numerical optimization. In this article, the variety of continuous optimization approaches that have been proposed …

A Fully Sparse Implementation of a Primal-Dual Interior-Point Potential Reduction Method for Semidefinite Programming

In this paper, we show a way to exploit sparsity in the problem data in a primal-dual potential reduction method for solving a class of semidefinite programs. When the problem data is sparse, the dual variable is also sparse, but the primal one is not. To avoid working with the dense primal variable, we apply …
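The sparsity asymmetry mentioned above can be seen numerically: for sparse data C and A_i, the dual slack S = C − Σ_i y_i A_i inherits the sparsity pattern of the data, while a primal matrix of the form X = μ S⁻¹ (as on the central path) is generally dense. The sketch below only illustrates this observation; the dimensions, densities, and diagonal shift are arbitrary, and it is not the method of the paper.

```python
import numpy as np
import scipy.sparse as sp

n, m, mu = 200, 5, 1.0

def random_sparse_sym(n, seed, density=0.02):
    A = sp.random(n, n, density=density, random_state=seed)
    return A + A.T                                    # symmetric sparse matrix

# Sparse SDP data: cost matrix C and constraint matrices A_1, ..., A_m.
C = random_sparse_sym(n, seed=0) + 5.0 * sp.eye(n)    # diagonal shift keeps S invertible
A = [random_sparse_sym(n, seed=i + 1) for i in range(m)]
y = 0.1 * np.ones(m)

# The dual slack S = C - sum_i y_i A_i inherits the sparsity of the data ...
S = C - sum(yi * Ai for yi, Ai in zip(y, A))
print("nonzero fraction of S:", S.nnz / n**2)

# ... while a primal matrix of the form X = mu * S^{-1} is dense.
X = mu * np.linalg.inv(S.toarray())
print("fraction of entries of X above 1e-12:", np.mean(np.abs(X) > 1e-12))
```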

Using Sampling and Simplex Derivatives in Pattern Search Methods

Pattern search methods can be made more efficient if past function evaluations are appropriately reused. In this paper we introduce a number of ways of reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients), to improve the efficiency of a pattern search iteration. At each iteration …
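A generic way to form a simplex gradient from previously evaluated points is to solve a small least-squares system in the function-value differences; the sketch below does exactly that for an invented objective and invented poll points, and is not the specific reuse scheme proposed in the paper.

```python
import numpy as np

def simplex_gradient(f, x0, points, values):
    """Simplex gradient from previously evaluated points around x0: solve the
    least-squares system S g ~ delta, where row i of S is points[i] - x0 and
    delta[i] = values[i] - f(x0)."""
    S = np.asarray(points) - np.asarray(x0)
    delta = np.asarray(values) - f(x0)
    g, *_ = np.linalg.lstsq(S, delta, rcond=None)
    return g

# Example: reuse poll points from a (hypothetical) previous pattern-search iteration.
f = lambda x: (x[0] - 1) ** 2 + 3 * (x[1] + 2) ** 2
x0 = np.array([0.0, 0.0])
pts = [x0 + d for d in (np.array([0.1, 0.0]), np.array([0.0, 0.1]), np.array([-0.1, 0.0]))]
vals = [f(p) for p in pts]
print(simplex_gradient(f, x0, pts, vals))   # approximates grad f(x0) = (-2, 12)
```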

Computational experience with an interior point algorithm for large scale contact problems

In this paper we present an interior point method for large-scale Signorini elastic contact problems. We study the case of an elastic body in frictionless contact with a rigid foundation. Primal and primal-dual algorithms are developed to solve the quadratic optimization problem arising in the variational formulation. Our computational study confirms the efficiency of …

Magnetic Resonance Tissue Density Estimation using Optimal SSFP Pulse-Sequence Design

In this paper, we formulate a nonlinear, nonconvex semidefinite optimization problem to select the steady-state free precession (SSFP) pulse-sequence design variables that maximize the contrast-to-noise ratio in tissue segmentation. The method could be applied to other pulse-sequence types and to arbitrary numbers of tissues and images. To solve the problem we use …

Pattern Search Method for Discrete L_1-Approximation

We propose a pattern search method to solve a classical nonsmooth optimization problem. In close analogy with pattern search methods for linearly constrained optimization, the set of search directions at each iteration is defined in such a way that it conforms to the local geometry of the set of points of nondifferentiability near the …

Parallel Greedy Randomized Adaptive Search Procedures

A GRASP (Greedy Randomized Adaptive Search Procedure) is a metaheuristic for producing good-quality solutions to combinatorial optimization problems. It is usually implemented with a construction procedure based on a greedy randomized algorithm, followed by local search. In this chapter, we survey parallel implementations of GRASP. We describe simple strategies to implement independent parallel GRASP heuristics …
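A toy, serial GRASP for a maximum-coverage instance, showing the two ingredients the abstract names: a greedy randomized construction driven by a restricted candidate list (RCL), followed by a swap-based local search. The problem, the RCL rule, and all parameters are illustrative; the parallel strategies surveyed in the chapter would run such iterations concurrently.

```python
import random

def coverage(chosen, subsets):
    return len(set().union(*(subsets[i] for i in chosen))) if chosen else 0

def grasp_max_cover(subsets, k, iters=100, alpha=0.3, seed=0):
    """Minimal GRASP for a toy maximum-coverage problem: choose k subsets
    covering as many elements as possible."""
    rng = random.Random(seed)
    best, best_cov = None, -1
    for _ in range(iters):
        # Greedy randomized construction: pick from a restricted candidate
        # list (RCL) of the near-best candidates by marginal coverage gain.
        chosen, covered = [], set()
        while len(chosen) < k:
            gains = [(len(s - covered), i) for i, s in enumerate(subsets) if i not in chosen]
            gmax, gmin = max(gains)[0], min(gains)[0]
            rcl = [i for g, i in gains if g >= gmax - alpha * (gmax - gmin)]
            pick = rng.choice(rcl)
            chosen.append(pick)
            covered |= subsets[pick]
        # Local search: first-improvement swaps of a chosen subset for an unchosen one.
        improved = True
        while improved:
            improved = False
            for i in chosen:
                for j in set(range(len(subsets))) - set(chosen):
                    trial = [c for c in chosen if c != i] + [j]
                    if coverage(trial, subsets) > coverage(chosen, subsets):
                        chosen, improved = trial, True
                        break
                if improved:
                    break
        if coverage(chosen, subsets) > best_cov:
            best, best_cov = chosen, coverage(chosen, subsets)
    return best, best_cov

subsets = [set(random.Random(i).sample(range(30), 8)) for i in range(12)]
print(grasp_max_cover(subsets, k=3))
```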