String-Averaging Projected Subgradient Methods for Constrained Minimization

We consider constrained minimization problems and propose to replace the projection onto the entire feasible region, required in the Projected Subgradient Method (PSM), by projections onto the individual sets whose intersection forms the entire feasible region. Specifically, we propose to perform such projections onto the individual sets in an algorithmic regime of a feasibility-seeking iterative …
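
A minimal sketch of the idea, assuming the feasible region is an intersection of halfspaces and using equal-weight simultaneous averaging as the feasibility-seeking regime (the paper's string-averaging scheme and step sizes may differ; all names here are illustrative):

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the halfspace {y : a @ y <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def averaged_projected_subgradient(subgrad, halfspaces, x0, steps):
    """Subgradient step, then an *average* of projections onto the
    individual halfspaces instead of one projection onto their intersection."""
    x = x0.copy()
    for k in range(steps):
        x = x - (1.0 / (k + 1)) * subgrad(x)   # diminishing step (illustrative)
        projs = [project_halfspace(x, a, b) for (a, b) in halfspaces]
        x = np.mean(projs, axis=0)             # equal-weight averaging
    return x

# Example: minimize ||x - c||_1 over {x : x_i >= 0}, written as halfspaces -x_i <= 0
c = np.array([1.0, -2.0, 0.5])
halfspaces = [(-np.eye(3)[i], 0.0) for i in range(3)]
print(averaged_projected_subgradient(lambda x: np.sign(x - c),
                                     halfspaces, np.zeros(3), 200))
```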

Composite Self-concordant Minimization

We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new …
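
The variable-metric framework itself is not spelled out in this excerpt; as the baseline it refines, here is a plain proximal-gradient sketch for a composite objective $f + g$ with $g = \lambda\|\cdot\|_1$, whose proximal operator is soft-thresholding. Note that the fixed $1/L$ step below relies on exactly the Lipschitz-gradient assumption the paper removes; the test problem and names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, lam, x0, step, iters):
    """Minimize f(x) + lam*||x||_1 by forward-backward steps:
    a gradient step on the smooth part, then the prox of the nonsmooth part."""
    x = x0.copy()
    for _ in range(iters):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Example: f(x) = 0.5*||A x - b||^2 (smooth), g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L for this quadratic
print(proximal_gradient(grad_f, lam=0.1, x0=np.zeros(5), step=step, iters=500))
```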

Inverse Parametric Optimization with an Application to Hybrid System Control

We present a number of results on inverse parametric optimization and its application to hybrid system control. We show that any function that can be written as the difference of two convex functions can also be written as a linear mapping of the solution to a convex parametric optimization problem. We exploit these results in …
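
A toy illustration of the representation result (our example, not the paper's): the function $|x|$ is a difference of convex functions, $|x| = 2\max(x,0) - x$, and it is also recovered as a (trivially) linear map of the solution of a convex parametric LP in the parameter $x$:
\[
  |x| \;=\; t^\star(x), \qquad
  t^\star(x) \;=\; \arg\min_{t \in \mathbb{R}} \bigl\{\, t \;:\; t \ge x,\ t \ge -x \,\bigr\}.
\]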

Multiperiod Portfolio Optimization with General Transaction Costs

We analyze the properties of the optimal portfolio policy for a multiperiod mean-variance investor facing multiple risky assets in the presence of general transaction costs such as proportional, market impact, and quadratic transaction costs. For proportional transaction costs, we find that a buy-and-hold policy is optimal: if the starting portfolio is outside a parallelogram-shaped no-trade …
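
A one-asset-at-a-time caricature of the no-trade-region logic under proportional costs (the paper's region is a multi-asset parallelogram; the targets and widths below are placeholders):

```python
import numpy as np

def rebalance_proportional(holdings, target, half_width):
    """Buy-and-hold inside a no-trade interval around the target:
    trade only to the *boundary* of the interval, never all the way
    to the target itself."""
    lower, upper = target - half_width, target + half_width
    return np.clip(holdings, lower, upper)

# Example: three assets with per-asset no-trade half-widths
holdings = np.array([0.10, 0.50, 0.30])
target   = np.array([0.25, 0.40, 0.35])
width    = np.array([0.05, 0.15, 0.10])
print(rebalance_proportional(holdings, target, width))
# -> the first asset trades to the boundary 0.20; the others stay put
```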

On Lower Complexity Bounds for Large-Scale Smooth Convex Optimization

In this note we present tight lower bounds on the information-based complexity of large-scale smooth convex minimization problems. We demonstrate, in particular, that the k-step Conditional Gradient (a.k.a. Frank-Wolfe) algorithm as applied to minimizing smooth convex functions over the n-dimensional box with n ≥ k is optimal, up to an O(ln n)-factor, in terms of …
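
For concreteness, a generic Frank-Wolfe sketch over the box $[-1,1]^n$, where the linear minimization oracle is a coordinate-wise sign; the lower-bound construction itself is not reproduced here:

```python
import numpy as np

def frank_wolfe_box(grad, x0, iters):
    """Conditional gradient over the box [-1,1]^n.
    The linear minimization oracle is closed-form: s = -sign(grad)."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = -np.sign(g)                # argmin_{s in [-1,1]^n} <g, s>
        gamma = 2.0 / (k + 2)          # classical step-size rule
        x = (1 - gamma) * x + gamma * s
    return x

# Example: minimize the smooth convex f(x) = 0.5*||x - c||^2 over the box
c = np.array([0.3, -2.0, 1.5])
print(frank_wolfe_box(lambda x: x - c, np.zeros(3), 100))
# -> approximately clip(c, -1, 1)
```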

Convex relaxation for finding planted influential nodes in a social network

We consider the problem of maximizing influence in a social network. We focus on the case that the social network is a directed bipartite graph whose arcs join senders to receivers. We consider both the case of deterministic networks and that of probabilistic graphical models, namely the so-called “cascade” model. The problem is to find the …
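
A small sketch of the probabilistic bipartite setting: if arc $(i,j)$ activates independently with probability $p_{ij}$, receiver $j$ is reached by a seed set $S$ with probability $1 - \prod_{i \in S}(1 - p_{ij})$. The convex relaxation itself is not reproduced; names are illustrative.

```python
import numpy as np

def expected_influence(P, senders):
    """Expected number of activated receivers in a bipartite independent-
    cascade model. P[i, j] = probability that sender i activates receiver j;
    `senders` is the chosen seed set (row indices)."""
    if not senders:
        return 0.0
    miss = np.prod(1.0 - P[senders, :], axis=0)   # P(receiver j untouched)
    return float(np.sum(1.0 - miss))

# Example: 3 senders, 4 receivers
P = np.array([[0.5, 0.0, 0.2, 0.0],
              [0.0, 0.9, 0.1, 0.0],
              [0.3, 0.0, 0.0, 0.4]])
print(expected_influence(P, [0, 2]))   # seed senders 0 and 2 -> 1.25
```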

A Generalized Proximal Point Algorithm and its Convergence Rate

We propose a generalized proximal point algorithm (PPA), in the generic setting of finding a zero point of a maximal monotone operator. In addition to the classical PPA, a number of benchmark operator splitting methods in the PDE and optimization literatures, such as the Douglas-Rachford splitting method, the Peaceman-Rachford splitting method, the alternating direction method of multipliers, the generalized …
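
The generalized scheme and its parameters are not in this excerpt; below is the classical PPA, $x_{k+1} = (I + \lambda T)^{-1}(x_k)$, instantiated for $T = \partial \|\cdot\|_1$, whose resolvent is soft-thresholding (an illustrative choice):

```python
import numpy as np

def classical_ppa(resolvent, x0, lam, iters):
    """Classical proximal point algorithm: x_{k+1} = (I + lam*T)^{-1}(x_k),
    for a maximal monotone operator T given through its resolvent."""
    x = x0.copy()
    for _ in range(iters):
        x = resolvent(x, lam)
    return x

# Example: T = subdifferential of ||.||_1, whose resolvent is soft-thresholding;
# PPA drives x to the zero of T, i.e., the minimizer x = 0.
soft = lambda x, lam: np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
print(classical_ppa(soft, np.array([3.0, -1.2, 0.4]), lam=0.5, iters=10))
```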

New Analysis and Results for the Conditional Gradient Method

We present new results for the conditional gradient method (also known as the Frank-Wolfe method). We derive computational guarantees for arbitrary step-size sequences, which are then applied to various step-size rules, including simple averaging and constant step-sizes. We also develop step-size rules and computational guarantees that depend naturally on the warm-start quality of the initial …
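
A sketch contrasting the step-size rules mentioned, with simple averaging realized as $\gamma_k = 1/(k+1)$ (the iterate becomes a running average of the oracle outputs) and the classical $\gamma_k = 2/(k+2)$ as reference; the test problem and constants are illustrative, not the paper's guarantees:

```python
import numpy as np

def conditional_gradient(grad, lmo, x0, iters, rule="classical", const=0.1):
    """Frank-Wolfe with pluggable step-size rules."""
    x = x0.copy()
    for k in range(iters):
        s = lmo(grad(x))
        if rule == "classical":
            gamma = 2.0 / (k + 2)
        elif rule == "averaging":
            gamma = 1.0 / (k + 1)      # simple averaging of oracle outputs
        else:
            gamma = const              # constant step-size
        x = (1 - gamma) * x + gamma * s
    return x

# Example: minimize 0.5*||x - c||^2 over the unit simplex; the LMO picks a vertex.
c = np.array([0.1, 0.7, 0.2])
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
print(conditional_gradient(lambda x: x - c, lmo, np.ones(3) / 3, 200,
                           rule="averaging"))
```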

A Deterministic Rescaled Perceptron Algorithm

The perceptron algorithm is a simple iterative procedure for finding a point in a convex cone $F$. At each iteration, the algorithm only involves a query of a separation oracle for $F$ and a simple update on a trial solution. The perceptron algorithm is guaranteed to find a feasible point in $F$ after $O(1/\tau_F^2)$ iterations, …
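
A sketch of the classical (un-rescaled) perceptron for the homogeneous system $Ax > 0$, where the separation oracle simply returns a violated row; the deterministic rescaling phase that gives the paper its improved complexity is not reproduced here:

```python
import numpy as np

def perceptron(A, max_iters=10000):
    """Classical perceptron for finding x with A @ x > 0 (a strictly
    feasible point of a polyhedral cone). The separation oracle returns
    any row a_i with a_i @ x <= 0, and the update is x <- x + a_i."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)   # unit-norm rows
    x = np.zeros(A.shape[1])
    for _ in range(max_iters):
        viol = np.flatnonzero(A @ x <= 0)
        if viol.size == 0:
            return x                   # A @ x > 0: done
        x = x + A[viol[0]]             # simple additive update
    return None                        # budget exhausted

# Example: the cone {x : x_1 > 0, x_1 + x_2 > 0, x_2 > 0}
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(perceptron(A))
```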

Some preconditioners for systems of linear inequalities

We show that a combination of two simple preprocessing steps generally improves the conditioning of a homogeneous system of linear inequalities. Our approach is based on a comparison among three different but related notions of conditioning for linear inequalities.
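
The excerpt does not name the two preprocessing steps; purely as an illustrative stand-in, here is one standard preconditioning step for a homogeneous system, scaling each inequality to unit norm, which leaves the solution set unchanged:

```python
import numpy as np

def normalize_rows(A):
    """Scale each inequality a_i @ x <= 0 to unit Euclidean norm. Positive
    row scaling never changes the solution set of a homogeneous system,
    but it can improve standard condition measures of the matrix A."""
    return A / np.linalg.norm(A, axis=1, keepdims=True)

# Example: wildly different row scales before vs. after
A = np.array([[1e6, 2e6], [1e-3, -1e-3]])
print(np.linalg.cond(normalize_rows(A)), "vs", np.linalg.cond(A))
```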