Efficient high-precision dense matrix algebra on parallel architectures for nonlinear discrete optimization

We provide a proof point for the idea that matrix-based algorithms for discrete optimization problems, mainly conceived for proving theoretical efficiency, can be easily and efficiently implemented on massively-parallel architectures by exploiting scalable and efficient parallel implementations of algorithms for ultra high-precision dense linear algebra. We have successfully implemented our algorithm on the Blue Gene/L …
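The excerpt does not include code; purely as an illustration of the kind of kernel involved, here is a minimal sketch of high-precision dense matrix multiplication using the mpmath library. The precision setting and matrix sizes are arbitrary choices for the example, not values taken from the paper, and this is a serial sketch rather than the massively-parallel implementation the authors describe.

```python
# Minimal sketch of high-precision dense matrix multiplication (not the
# paper's parallel implementation): mpmath provides arbitrary-precision
# floats; mp.dps sets the working precision in decimal digits.
from mpmath import mp, mpf, matrix

mp.dps = 100  # 100 decimal digits of working precision (illustrative choice)

def sample_matrix(n):
    """Build an n-by-n mpmath matrix with simple deterministic entries."""
    return matrix([[mpf(1) / (i + j + 1) for j in range(n)] for i in range(n)])

A = sample_matrix(8)   # a small Hilbert-like matrix
B = sample_matrix(8)
C = A * B              # dense product carried out at 100-digit precision

print(C[0, 0])         # first entry, printed at the working precision
```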

Implementing Algorithms for Signal and Image Reconstruction on Graphical Processing Units

Several highly effective algorithms that have been proposed recently for compressed sensing and image processing applications can be implemented efficiently on commodity graphical processing units (GPUs). The properties of algorithms and applications that make for efficient GPU implementation are discussed, and computational results for several algorithms are presented that show large speedups over CPU implementations. …
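The excerpt does not name the algorithms; as a generic illustration of the kind of operation that tends to map well to GPUs in compressed sensing, the sketch below runs a plain ISTA iteration for l1-regularized least squares. It is written with NumPy so it runs anywhere; the dense products and elementwise thresholding it uses are exactly the operations that GPU array libraries (e.g. CuPy) accelerate. This is an illustrative sketch, not the paper's algorithms.

```python
# Generic ISTA iteration for the l1-regularized least-squares problem
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# (illustrative sketch only; not the algorithms from the paper).
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator, the prox of t*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, iters=200):
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))
x_true = np.zeros(256); x_true[:5] = 1.0     # a 5-sparse signal
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.count_nonzero(np.abs(x_hat) > 1e-3))
```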

Incorporating Minimum Frobenius Norm Models in Direct Search

The goal of this paper is to show that the use of minimum Frobenius norm quadratic models can improve the performance of direct-search methods. The approach taken here is to maintain the structure of directional direct-search methods, organized around a search and a poll step, and to use the set of previously evaluated points generated …
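For readers unfamiliar with the construction, the minimum Frobenius norm quadratic model is typically defined by the following subproblem, stated here in its standard generic form (which may differ in detail from the variant used in the paper):

```latex
% Standard minimum Frobenius norm model subproblem: given evaluated points
% y^1,\dots,y^p with values f(y^i), fit m(x) = c + g^\top x + \tfrac12 x^\top H x by
\begin{align*}
\min_{c,\,g,\,H=H^\top}\ & \tfrac12 \|H\|_F^2 \\
\text{s.t.}\ & m(y^i) = f(y^i), \qquad i = 1,\dots,p,
\end{align*}
% i.e. among all quadratics interpolating the data, take the one whose Hessian
% has minimum Frobenius norm.
```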

Global Optimization of Non-Linear Systems of Equations by Simulating the Flight of a Projectile in the Conformational Space

A new heuristic optimization algorithm is presented based on an analogy with the physical phenomenon of a projectile launched in a conformational space under the influence of a gravitational force. Its implementation simplicity and the option to enhance it with local search methods make it ideal for the optimization of non-linear systems of equations. The …
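The projectile heuristic itself is not described in this excerpt. What such methods typically minimize is the squared residual of the system, and the abstract's mention of local-search enhancement suggests a refinement step of the following kind; the sketch below only sets up that merit function and a local refinement (the example system and the use of SciPy's Nelder-Mead are illustrative assumptions, not the paper's method).

```python
# A nonlinear system F(x) = 0 is commonly recast as global minimization of the
# merit function ||F(x)||^2, whose global minimizers with value 0 are the roots.
# This sketch sets up that merit function and refines a candidate point with a
# local search; it does NOT implement the projectile heuristic itself.
import numpy as np
from scipy.optimize import minimize

def F(x):
    """Example system: x0^2 + x1^2 - 1 = 0 and x0 - x1 = 0."""
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def merit(x):
    r = F(x)
    return float(r @ r)          # squared residual norm

x0 = np.array([2.0, -1.0])       # candidate produced by some global heuristic
res = minimize(merit, x0, method="Nelder-Mead")
print(res.x, merit(res.x))       # should approach a root near (±1/sqrt(2), ±1/sqrt(2))
```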

A globally convergent primal-dual interior-point filter method for nonlinear programming: new filter optimality measures and computational results

In this paper we modify the original primal-dual interior-point filter method proposed in [18] for the solution of nonlinear programming problems. We introduce two new optimality filter entries based on the objective function, and thus better suited for the purposes of minimization, and propose conditions for using inexact Hessians. We show that the global convergence …
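The paper's new optimality filter entries are not reproduced in this excerpt; as background, one common form of the filter acceptance test, with a feasibility measure and an optimality measure, is the following (a generic formulation, not necessarily the one used in the paper):

```latex
% Generic filter acceptance test (background only). With a feasibility measure
% \theta and an optimality measure \varphi, a trial point x^{+} is acceptable
% to the filter \mathcal{F} = \{(\theta_j,\varphi_j)\} if, for every entry j,
\[
  \theta(x^{+}) \le \beta\,\theta_j
  \quad\text{or}\quad
  \varphi(x^{+}) \le \varphi_j - \gamma\,\theta(x^{+}),
\]
% for fixed constants \beta,\gamma \in (0,1); that is, the point must
% sufficiently improve feasibility or optimality relative to each stored pair.
```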

Python Optimization Modeling Objects (Pyomo)

We describe Pyomo, an open-source tool for modeling optimization applications in Python. Pyomo can be used to define abstract problems, create concrete problem instances, and solve these instances with standard solvers. Pyomo provides a capability that is commonly associated with algebraic modeling languages like AMPL and GAMS. Pyomo leverages the capabilities of the Coopr software, …
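A minimal example of the kind of model Pyomo expresses, written against the current pyomo.environ API (which postdates, and may differ from, the Coopr-era interface discussed in the paper); the GLPK solver is an illustrative choice and must be installed separately:

```python
# A small concrete Pyomo model: maximize 3x + 2y s.t. x + y <= 4, x <= 3, x,y >= 0.
import pyomo.environ as pyo

model = pyo.ConcreteModel()
model.x = pyo.Var(within=pyo.NonNegativeReals)
model.y = pyo.Var(within=pyo.NonNegativeReals)

model.obj = pyo.Objective(expr=3 * model.x + 2 * model.y, sense=pyo.maximize)
model.c1 = pyo.Constraint(expr=model.x + model.y <= 4)
model.c2 = pyo.Constraint(expr=model.x <= 3)

solver = pyo.SolverFactory("glpk")   # any installed LP solver works here
solver.solve(model)
print(pyo.value(model.x), pyo.value(model.y), pyo.value(model.obj))
```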

Strange Behaviors of Interior-point Methods for Solving Semidefinite Programming Problems in Polynomial Optimization

We observe that in a simple one-dimensional polynomial optimization problem (POP), the 'optimal' values of semidefinite programming (SDP) relaxation problems reported by the standard SDP solvers converge to the optimal value of the POP, while the true optimal values of SDP relaxation problems are strictly and significantly less than that value. Some pieces of circumstantial …
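The specific one-dimensional instances studied in the paper are not shown in this excerpt; for orientation, a standard order-d moment (SDP) relaxation of an unconstrained univariate POP takes the following textbook form (not necessarily the exact relaxation used in the paper):

```latex
% Standard moment (SDP) relaxation of a univariate POP. To minimize
% p(x) = \sum_{k=0}^{2d} p_k x^k over \mathbb{R}, introduce pseudo-moments
% y = (y_0,\dots,y_{2d}) and solve
\begin{align*}
\min_{y}\ & \sum_{k=0}^{2d} p_k\, y_k \\
\text{s.t.}\ & y_0 = 1, \qquad
  M_d(y) := \bigl[\, y_{i+j} \,\bigr]_{i,j=0}^{d} \succeq 0,
\end{align*}
% where M_d(y) is the Hankel moment matrix. The optimal value of this SDP is a
% lower bound on the optimal value of the POP; the paper studies cases where
% the values reported by SDP solvers differ markedly from the true SDP optima.
```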

A Comparison of Software Packages for Verified Linear Programming

Linear programming is arguably one of the most basic forms of optimization. Its theory and algorithms can not only be applied to linear optimization problems but also to relaxations of nonlinear problems and branch-and-bound methods for mixed-integer and global optimization problems. Recent research shows that, contrary to intuition, bad condition numbers occur frequently in linear programming. …
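One ingredient commonly used by verified LP software is a rigorous bound derived from weak duality and evaluated with interval arithmetic or directed rounding; stated generically below as background, not as a description of any particular package's algorithm:

```latex
% Weak-duality bound underlying many verified LP techniques. For the LP
%   \min \{\, c^\top x : A x = b,\ x \ge 0 \,\},
% every y satisfying A^\top y \le c certifies the rigorous lower bound
\[
  b^\top y \;\le\; \min \{\, c^\top x : A x = b,\ x \ge 0 \,\}.
\]
% Verified solvers evaluate such bounds with interval arithmetic or directed
% rounding, so the inequality holds despite floating-point rounding error.
```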

Parallel Space Decomposition of the Mesh Adaptive Direct Search algorithm

This paper describes a parallel space decomposition (PSD) technique for the mesh adaptive direct search (MADS) algorithm. MADS extends generalized pattern search to constrained nonsmooth optimization problems. The objective of the present work is to obtain good solutions to larger problems than the ones typically solved by MADS. The new method PSD-MADS is an …
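Since the excerpt only names the algorithms, the sketch below shows the basic poll-and-refine structure of a directional direct search, in the form of a plain coordinate search. It is a deliberately simplified illustration and is not the MADS or PSD-MADS algorithm.

```python
# Simplified directional direct search (coordinate-search poll with mesh
# refinement), meant only to illustrate the poll/refine structure that MADS
# generalizes; it is NOT the MADS or PSD-MADS algorithm.
import numpy as np

def coordinate_search(f, x, delta=1.0, delta_min=1e-6, max_iter=1000):
    n = len(x)
    fx = f(x)
    for _ in range(max_iter):
        if delta < delta_min:
            break
        improved = False
        # Poll step: evaluate f at the 2n neighbours x ± delta * e_i.
        for i in range(n):
            for sign in (+1.0, -1.0):
                y = x.copy()
                y[i] += sign * delta
                fy = f(y)
                if fy < fx:            # simple decrease: accept, keep mesh size
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            delta *= 0.5               # unsuccessful poll: refine the mesh
    return x, fx

x_best, f_best = coordinate_search(lambda z: (z[0] - 1) ** 2 + (z[1] + 2) ** 2,
                                   np.array([5.0, 5.0]))
print(x_best, f_best)
```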

Benchmarking Derivative-Free Optimization Algorithms

We propose data profiles as a tool for analyzing the performance of derivative-free optimization solvers when there are constraints on the computational budget. We use performance and data profiles, together with a convergence test that measures the decrease in function value, to analyze the performance of three solvers on sets of smooth, noisy, and piecewise-smooth …
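The convergence test and data-profile definition used in this line of work take the following form:

```latex
% A solver is said to solve problem p (starting point x_0, budget measured in
% function evaluations) to accuracy \tau > 0 once it finds x with
\[
  f(x) \;\le\; f_L + \tau\,\bigl(f(x_0) - f_L\bigr),
\]
% where f_L is the smallest value obtained by any solver within the budget.
% With t_{p,s} the number of evaluations solver s needs to satisfy this test on
% problem p (t_{p,s} = \infty if it never does) and n_p the number of variables,
% the data profile of solver s over the problem set P is
\[
  d_s(\alpha) \;=\; \frac{1}{|P|}\,
  \Bigl|\bigl\{\, p \in P : \tfrac{t_{p,s}}{\,n_p + 1\,} \le \alpha \,\bigr\}\Bigr|,
\]
% i.e. the fraction of problems solved within \alpha simplex gradients
% (n_p + 1 evaluations each).
```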