Learning for Spatial Branching: An Algorithm Selection Approach

The use of machine learning techniques to improve the performance of branch-and-bound optimization algorithms is a very active research area for mixed-integer linear problems, but little has been done for nonlinear optimization. To bridge this gap, we develop a learning framework for spatial branching and show its efficacy in the context of …

Discerning the linear convergence of ADMM for structured convex optimization through the lens of variational analysis

Despite the rich literature, the linear convergence of the alternating direction method of multipliers (ADMM) has not been fully understood, even in the convex case. For example, the linear convergence of ADMM can be empirically observed in a wide range of applications, while existing theoretical results seem to be too stringent to be satisfied or too …
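For context, the iteration whose linear convergence is at issue is the standard two-block ADMM for $\min_{x,z} f(x) + g(z)$ subject to $Ax + Bz = c$; the notation below ($f$, $g$, $A$, $B$, $c$, penalty $\rho$) is generic rather than taken from the paper:

\[
\begin{aligned}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^k - c + \lambda^k/\rho \rVert^2, \\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + \lambda^k/\rho \rVert^2, \\
\lambda^{k+1} &= \lambda^k + \rho\,(Ax^{k+1} + Bz^{k+1} - c).
\end{aligned}
\]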

Coalescing Data and Decision Sciences for Analytics

The dream of analytics is to work from common, clean, and consistent data sources in a manner such that all of its facets (descriptive, predictive, and prescriptive) are supported via a coherent vision of data and decision sciences. To the extent that data and decision sciences work within logically/mathematically consistent frameworks, and that these paradigms …

Learning Enabled Optimization: Towards a Fusion of Statistical Learning and Stochastic Optimization

Several emerging applications, such as “Analytics of Things” and “Integrative Analytics”, call for a fusion of statistical learning (SL) and stochastic optimization (SO). The Learning Enabled Optimization paradigm fuses concepts from these disciplines in a manner that not only enriches both SL and SO, but also provides a framework that supports rapid model updates and …

Generalized Symmetric ADMM for Separable Convex Optimization

The Alternating Direction Method of Multipliers (ADMM) has been shown to be effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a Generalized Symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. This GS-ADMM partitions the data into two …
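As a point of reference, a symmetric ADMM with two multiplier updates takes the following form for the two-block model $\min f(x) + g(y)$ subject to $Ax + By = b$, with augmented Lagrangian $\mathcal{L}_\rho(x,y,\lambda) = f(x) + g(y) + \langle \lambda, Ax + By - b \rangle + \tfrac{\rho}{2}\lVert Ax + By - b \rVert^2$; the admissible ranges for the stepsizes $\tau$ and $\sigma$, and the grouping of the multiple blocks, are as specified in the paper and only sketched generically here:

\[
\begin{aligned}
x^{k+1} &= \arg\min_x \; \mathcal{L}_\rho(x, y^k, \lambda^k), \\
\lambda^{k+1/2} &= \lambda^k + \tau\rho\,(Ax^{k+1} + By^k - b), \\
y^{k+1} &= \arg\min_y \; \mathcal{L}_\rho(x^{k+1}, y, \lambda^{k+1/2}), \\
\lambda^{k+1} &= \lambda^{k+1/2} + \sigma\rho\,(Ax^{k+1} + By^{k+1} - b).
\end{aligned}
\]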

Stochastic Compositional Gradient Descent: Algorithms for Minimizing Compositions of Expected-Value Functions

Classical stochastic gradient methods are well suited for minimizing expected-value objective functions. However, they do not apply to the minimization of a nonlinear function involving expected values or a composition of two expected-value functions, i.e., problems of the form $\min_x \mathbb{E}_v\big[ f_v\big( \mathbb{E}_w[g_w(x)] \big) \big]$. In order to solve this stochastic composition problem, we propose a class …
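A minimal sketch of the basic two-timescale SCGD recursion for this problem, assuming $f_v$ and $g_w$ are differentiable; the function names, step-size schedules, and the toy example are illustrative, not taken from the paper:

    import numpy as np

    def scgd(sample_g, sample_Jg, sample_grad_f, x0, iters=20000):
        """Two-timescale SCGD for min_x E_v[ f_v( E_w[ g_w(x) ] ) ].

        sample_g(x)      -- one sample g_w(x), shape (m,)
        sample_Jg(x)     -- Jacobian of a sampled g_w at x, shape (m, n)
        sample_grad_f(y) -- gradient of a sampled f_v at y, shape (m,)
        """
        x = np.asarray(x0, dtype=float)
        y = sample_g(x)                      # running estimate of E_w[g_w(x)]
        for k in range(1, iters + 1):
            alpha = 1.0 / k ** 0.75          # slow stepsize for x (illustrative)
            beta = 1.0 / k ** 0.5            # fast stepsize for y (illustrative)
            y = (1 - beta) * y + beta * sample_g(x)            # track inner expectation
            x = x - alpha * sample_Jg(x).T @ sample_grad_f(y)  # quasi-gradient step
        return x

    # Toy check: g_w(x) = x + w with E[w] = 0, f_v(y) = (y - v)^2 with E[v] = 1,
    # so the compositional objective is minimized at x = 1.
    rng = np.random.default_rng(0)
    x = scgd(lambda x: x + rng.normal(0.0, 0.1, 1),
             lambda x: np.eye(1),
             lambda y: 2.0 * (y - (1.0 + rng.normal(0.0, 0.1))),
             x0=np.zeros(1))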

The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent

The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem whose objective function is the sum of …
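For reference, the direct (Gauss-Seidel) extension in question, for $\min f_1(x_1) + f_2(x_2) + f_3(x_3)$ subject to $A_1x_1 + A_2x_2 + A_3x_3 = b$ with augmented Lagrangian $\mathcal{L}_\rho$, updates the blocks one at a time and the multiplier once per cycle; the notation here is generic:

\[
\begin{aligned}
x_1^{k+1} &= \arg\min_{x_1} \; \mathcal{L}_\rho(x_1, x_2^k, x_3^k, \lambda^k), \\
x_2^{k+1} &= \arg\min_{x_2} \; \mathcal{L}_\rho(x_1^{k+1}, x_2, x_3^k, \lambda^k), \\
x_3^{k+1} &= \arg\min_{x_3} \; \mathcal{L}_\rho(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^k), \\
\lambda^{k+1} &= \lambda^k + \rho\,(A_1x_1^{k+1} + A_2x_2^{k+1} + A_3x_3^{k+1} - b).
\end{aligned}
\]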