Product Assortment Competition with the Decoy Effect

The fraction of customers who choose a particular item from among a set of available items can be increased significantly by the inclusion of a related inferior (and apparently irrelevant) item in the choice set. This violation of the independence of irrelevant alternatives and regularity properties is called the decoy effect, dominance effect, or …
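For context, a minimal sketch (hypothetical utilities, not data from the paper) of why the decoy effect is incompatible with any choice model satisfying regularity, such as the multinomial logit:

```python
import math

def mnl_shares(utilities):
    """Multinomial-logit choice shares: softmax of deterministic utilities."""
    z = sum(math.exp(u) for u in utilities.values())
    return {k: math.exp(u) / z for k, u in utilities.items()}

# Hypothetical utilities: target B, competitor A, and a decoy D dominated by B.
base = {"A": 1.0, "B": 1.0}
with_decoy = {"A": 1.0, "B": 1.0, "D": 0.2}

print(mnl_shares(base))        # B gets 50% of choices
print(mnl_shares(with_decoy))  # B's share *falls* -- regularity holds in MNL
# The empirically observed decoy effect (B's share rising after D is added)
# therefore cannot be reproduced by any model satisfying regularity.
```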

On Augmentation Algorithms for Linear and Integer-Linear Programming: From Edmonds-Karp to Bland and Beyond

Motivated by Bland’s linear-programming generalization of the renowned Edmonds-Karp efficient refinement of the Ford-Fulkerson maximum-flow algorithm, we discuss three closely related natural augmentation rules for linear and integer-linear optimization. In several nice situations, we show that polynomially many augmentation steps suffice to reach an optimum. In particular, when using “discrete steepest-descent augmentations” (i.e., directions with the best …
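The skeleton that all such rules share is the classical augmentation loop. The sketch below assumes two hypothetical oracles, `improving_direction` and `max_step`; it is not code from the paper:

```python
import numpy as np

def augment(x, improving_direction, max_step):
    """Generic augmentation loop for (integer-)linear optimization.
    `improving_direction(x)` is assumed to return an augmenting direction z
    that improves the objective at x, or None if no such direction exists;
    `max_step(x, z)` the largest step length t keeping x + t*z feasible."""
    while True:
        z = improving_direction(x)
        if z is None:              # no improving direction: x is optimal
            return x
        t = max_step(x, z)         # take a maximal augmentation step
        x = x + t * np.asarray(z)
```

The three rules discussed in the paper differ in which improving direction the oracle is required to return (e.g., a steepest-descent direction); the outer loop is the same.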

Quadratic regularization projected alternating Barzilai–Borwein method for constrained optimization

In this paper, based on regularization techniques and projected gradient strategies, we present a quadratic regularization projected alternating Barzilai–Borwein (QRPABB) method for minimizing differentiable functions on closed convex sets. We show convergence of the QRPABB method to a constrained stationary point under a nonmonotone line search. When the objective function is convex, we …
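To fix ideas, here is a minimal sketch of the underlying projected alternating Barzilai–Borwein iteration with a GLL-style nonmonotone line search. It illustrates the general idea only; the paper’s QRPABB method adds a quadratic regularization on top of this scheme:

```python
import numpy as np

def projected_alt_bb(f, grad, proj, x0, iters=500, M=10, sigma=1e-4):
    """Projected gradient method with alternating Barzilai-Borwein step
    sizes and a nonmonotone (GLL) line search; an illustrative sketch."""
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0
    hist = [f(x)]                       # recent f-values for nonmonotone test
    for k in range(iters):
        d = proj(x - alpha * g) - x     # projected-gradient direction
        if np.linalg.norm(d) < 1e-10:
            break                       # constrained stationary point reached
        t, fmax = 1.0, max(hist[-M:])
        while f(x + t * d) > fmax + sigma * t * g.dot(d):
            t *= 0.5                    # backtrack; g.dot(d) < 0 here
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if k % 2 == 0:                  # alternate the two BB step sizes
            alpha = s.dot(s) / max(s.dot(y), 1e-12)   # BB1
        else:
            alpha = s.dot(y) / max(y.dot(y), 1e-12)   # BB2
        alpha = min(max(alpha, 1e-8), 1e8)
        x, g = x_new, g_new
        hist.append(f(x))
    return x
```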

Fast Projection onto the Simplex and the l1 Ball

A new algorithm is proposed to project, exactly and in finite time, a vector of arbitrary size onto a simplex or an l1-norm ball. The algorithm is demonstrated to be faster than existing methods. In addition, an incorrect statement in a paper by Duchi et al. is corrected and an adversary sequence for Michelot’s algorithm …
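For reference, the classical O(n log n) sort-and-threshold projection that faster methods in this line of work are benchmarked against (this is the standard baseline, not the paper’s new algorithm):

```python
import numpy as np

def project_simplex(v, radius=1.0):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = radius}."""
    u = np.sort(v)[::-1]                       # sort in decreasing order
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - radius)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)  # water-filling threshold
    return np.maximum(v - theta, 0.0)

def project_l1_ball(v, radius=1.0):
    """Projection onto the l1 ball, via a simplex projection of |v|."""
    if np.abs(v).sum() <= radius:
        return v.copy()                        # already inside the ball
    return np.sign(v) * project_simplex(np.abs(v), radius)

x = project_simplex(np.array([0.5, 1.2, -0.3]))
print(x, x.sum())                              # nonnegative, sums to 1
```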

Preconditioning of Active-Set Newton Methods for PDE-constrained Optimal Control Problems

We address the problem of preconditioning a sequence of saddle point linear systems arising in the solution of PDE-constrained optimal control problems via active-set Newton methods, with control and (regularized) state constraints. We present two new preconditioners based on a full block matrix factorization of the Schur complement of the Jacobian matrices, where the active-set …
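To fix ideas, a minimal numerical sketch of a Schur-complement block preconditioner for a saddle-point system K = [[A, Bᵀ], [B, 0]]; an illustrative toy with an exact Schur complement, not the paper’s active-set preconditioners:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n, m = 40, 10
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
A = A @ A.T                                    # SPD (1,1) block
B = rng.standard_normal((m, n))
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
S = B @ np.linalg.solve(A, B.T)                # Schur complement of A

def apply_prec(r):
    """Block lower-triangular preconditioner solve: A-block, then Schur."""
    r1, r2 = r[:n], r[n:]
    y1 = np.linalg.solve(A, r1)
    y2 = np.linalg.solve(-S, r2 - B @ y1)
    return np.concatenate([y1, y2])

M = LinearOperator((n + m, n + m), matvec=apply_prec)
b = rng.standard_normal(n + m)
x, info = gmres(K, b, M=M)
print(info, np.linalg.norm(K @ x - b))         # converges in a few iterations
```

With the exact Schur complement this preconditioner is “ideal” and Krylov convergence is essentially immediate; in practice S is replaced by a cheap approximation, which is where the design of such preconditioners lives.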

On Second Order Optimality Conditions in Nonlinear Optimization

In this work we present new weak conditions that ensure the validity of necessary second order optimality conditions (SOC) for nonlinear optimization. We prove that weak and strong SOCs hold for all Lagrange multipliers under Abadie-type assumptions. We also prove weak and strong SOCs for at least one Lagrange multiplier imposing the …
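For context, the usual statements of the two conditions in this literature (standard notation assumed: L is the Lagrangian, h the equality and g the inequality constraints, A(x*) the active inequality indices):

```latex
\[
\text{weak SOC:}\quad d^{\top}\nabla^{2}_{xx}L(x^{*},\lambda,\mu)\,d \;\ge\; 0
\quad \text{for all } d \text{ with } \nabla h_i(x^{*})^{\top}d = 0,\;
\nabla g_j(x^{*})^{\top}d = 0,\; j \in A(x^{*}),
\]
\[
\text{strong SOC:}\quad d^{\top}\nabla^{2}_{xx}L(x^{*},\lambda,\mu)\,d \;\ge\; 0
\quad \text{for all } d \text{ in the critical cone } C(x^{*}).
\]
```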

E. Lieb convexity inequalities and noncommutative Bernstein inequality in Jordan-algebraic setting

We describe a Jordan-algebraic version of E. Lieb’s convexity inequalities. Joint convexity of a Jordan-algebraic version of quantum entropy is proven. A spectral theory on semi-simple complex Jordan algebras is used as a tool to prove the convexity results. Possible applications to optimization and statistics are indicated. Citation: Preprint, University of Notre Dame, August 2014.
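For orientation, the matrix-algebra prototypes being generalized here (standard statements in the usual matrix setting, not the paper’s Jordan-algebraic versions):

```latex
\[
A \;\longmapsto\; \operatorname{tr}\exp\bigl(H + \log A\bigr)
\quad \text{is concave on positive definite matrices (Lieb),}
\]
\[
(A,B) \;\longmapsto\; \operatorname{tr}\bigl[A(\log A - \log B)\bigr]
\quad \text{is jointly convex (quantum relative entropy).}
\]
```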

A tight iteration-complexity upper bound for the MTY predictor-corrector algorithm via redundant Klee-Minty cubes

It is an open question whether there is an interior-point algorithm for linear optimization problems with a lower iteration-complexity than the classical bound $\mathcal{O}(\sqrt{n} \log(\frac{\mu_1}{\mu_0}))$. This paper provides a negative answer to that question for a variant of the Mizuno-Todd-Ye predictor-corrector algorithm. In fact, we prove that for any $\epsilon >0$, there is a redundant …
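For reference, a standard form of the n-dimensional Klee-Minty cube on which the redundant construction builds (the redundant inequalities themselves are specified in the paper):

```latex
\[
\max\; x_n \quad \text{s.t.}\quad 0 \le x_1 \le 1,\qquad
\varepsilon\, x_{k-1} \;\le\; x_k \;\le\; 1 - \varepsilon\, x_{k-1},
\quad k = 2,\dots,n,\quad 0 < \varepsilon < \tfrac{1}{2}.
\]
```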

Fast Algorithms for the Minimum Volume Estimator

The minimum-volume ellipsoid (MVE) estimator is an important tool in robust regression and outlier detection in statistics. We develop fast algorithms for the MVE estimator problem and discuss how they can be implemented efficiently. The novelty of our approach stems from recent developments in first-order algorithms for solving the related Minimum Volume Enclosing …
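A minimal sketch of the minimum-volume-enclosing-ellipsoid (MVEE) building block that such first-order approaches rest on, in the style of Khachiyan’s coordinate-ascent algorithm (illustrative only, not the paper’s algorithm for the MVE estimator):

```python
import numpy as np

def mvee(P, tol=1e-7, max_iter=10000):
    """Khachiyan-style algorithm for the minimum-volume ellipsoid
    enclosing the columns of the d x n matrix P."""
    d, n = P.shape
    Q = np.vstack([P, np.ones(n)])       # lift to homogeneous coordinates
    u = np.full(n, 1.0 / n)              # weights on the points
    for _ in range(max_iter):
        X = Q @ np.diag(u) @ Q.T
        M = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(X), Q)  # leverages
        j = np.argmax(M)
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        if step < tol:
            break
        u *= 1.0 - step                  # shift weight toward the worst point
        u[j] += step
    c = P @ u                            # ellipsoid center
    A = np.linalg.inv(P @ np.diag(u) @ P.T - np.outer(c, c)) / d
    return A, c                          # {x : (x - c)^T A (x - c) <= 1}
```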

Block stochastic gradient iteration for convex and nonconvex optimization

The stochastic gradient (SG) method can minimize an objective function composed of a large number of differentiable functions, or solve a stochastic optimization problem, to a moderate accuracy. The block coordinate descent/update (BCD) method, on the other hand, handles problems with multiple blocks of variables by updating them one at a time; when the blocks …
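A minimal sketch of the combined iteration the abstract describes, under an assumed interface (the `grad_block` oracle and the block partition are hypothetical, not the paper’s API):

```python
import numpy as np

def block_sg(grad_block, x0, blocks, n_samples, iters=1000, lr=0.05):
    """Block stochastic gradient: at each step, move to the next block of
    variables and update it with a stochastic partial gradient computed
    from one randomly sampled data term.
    grad_block(x, b, i): partial gradient w.r.t. block b at sample i."""
    x = np.asarray(x0, dtype=float)
    rng = np.random.default_rng(0)
    for k in range(iters):
        b = blocks[k % len(blocks)]      # cycle through the variable blocks
        i = rng.integers(n_samples)      # one sample: the stochastic part
        x[b] -= lr / np.sqrt(k + 1) * grad_block(x, b, i)  # diminishing step
    return x
```

Here `blocks` is a list of index arrays partitioning the variables; combining the cyclic block sweep of BCD with per-sample gradients is exactly the hybrid structure the abstract motivates.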