A Different Perspective on the Stochastic Convex Feasibility Problem

We analyze a simple randomized subgradient method for approximating solutions to stochastic systems of convex functional constraints, the only input to the algorithm being the size of the minibatches. After introducing a new notion of what it means for a point to approximately solve the constraints, we show that determining bounds on the expected number of iterations reduces to …
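The excerpt stops before the method itself is specified, so the following is only a minimal sketch of a minibatch subgradient step for a stochastic system g_i(x) ≤ 0. The sampling routine `sample_constraints`, the most-violated-constraint rule, and the Polyak-type step size are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def minibatch_feasibility_step(x, sample_constraints, batch_size, rng):
    """One generic minibatch subgradient step for a stochastic system
    g_i(x) <= 0.  `sample_constraints` draws `batch_size` pairs
    (g_i(x), subgradient of g_i at x); all names here are illustrative,
    not the paper's notation."""
    batch = sample_constraints(x, batch_size, rng)
    # Focus on the most violated constraint in the minibatch.
    val, sub = max(batch, key=lambda pair: pair[0])
    if val <= 0.0:
        return x  # minibatch already satisfied; keep the iterate
    # Polyak-type step: drives the violated constraint's value toward zero.
    return x - (val / max(np.dot(sub, sub), 1e-12)) * sub
```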

A Simple Nearly-Optimal Restart Scheme For Speeding-Up First-Order Methods

We present a simple scheme for restarting first-order methods for convex optimization problems. Restarts are triggered solely by achieving specified decreases in objective value, the specified amounts being the same for all optimization problems. Unlike existing restart schemes, ours makes no attempt to learn parameter values characterizing the structure of an optimization problem, …
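As a rough illustration of a restart rule driven only by objective decrease, here is a hedged single-instance sketch; how the paper chooses and schedules the fixed decreases is not in the excerpt, so `method`, `decrease`, and `restarts` below are placeholders.

```python
def restarted(method, f, x0, decrease, restarts):
    """Single-instance sketch: rerun `method` from its latest point
    whenever the objective has dropped by the fixed amount `decrease`
    since the last restart.  (The paper's scheme is more elaborate;
    this shows only the decrease-triggered restart rule.)"""
    x = x0
    for _ in range(restarts):
        target = f(x) - decrease
        # `method` is any first-order routine: it iterates from x until
        # f falls below `target` (or a budget runs out) and returns the iterate.
        x = method(f, x, target)
    return x
```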

Efficient Subgradient Methods for General Convex Optimization

A subgradient method is presented for solving general convex optimization problems, the main requirement being that a strictly feasible point is known. A feasible sequence of iterates is generated that converges to within user-specified error of optimality. Feasibility is maintained via a line search at each iteration, avoiding the need for orthogonal projections onto the feasible region …
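One way to maintain feasibility without projections, consistent with the excerpt's description, is to search along the segment between a trial point and the known strictly feasible point. The bisection rule below is an illustrative stand-in for the paper's line search.

```python
def pull_back(x_new, e, feasible, tol=1e-10):
    """Line search on the segment [e, x_new]: find (approximately) the
    point closest to x_new that satisfies `feasible`.  Assumes e is
    strictly feasible; bisection is an illustrative choice, not
    necessarily the paper's rule."""
    if feasible(x_new):
        return x_new
    lo, hi = 0.0, 1.0  # convex-combination weight on x_new
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(e + mid * (x_new - e)):
            lo = mid
        else:
            hi = mid
    return e + lo * (x_new - e)
```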

Accelerated First-Order Methods for Hyperbolic Programming

A framework is developed for applying accelerated methods to general hyperbolic programming, including linear, second-order cone, and semidefinite programming as special cases. The approach replaces a hyperbolic program with a convex optimization problem whose smooth objective function is explicit, and for which the only constraints are linear equations (one more linear equation than for the …
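Once a problem has an explicit smooth objective and only linear equality constraints, standard acceleration applies with a closed-form affine projection. The sketch below is generic Nesterov/FISTA-style acceleration under that constraint structure, not the paper's explicit objective; `grad` and the smoothness constant `L` are placeholders.

```python
import numpy as np

def accelerated_affine(grad, L, A, b, x0, iters):
    """Nesterov-accelerated gradient descent of a smooth objective
    restricted to {x : A x = b}, the constraint structure of the
    transformed problem.  Projection onto an affine set is a cheap
    closed-form map."""
    pinv = np.linalg.pinv(A)
    proj = lambda z: z - pinv @ (A @ z - b)
    x = y = proj(x0)
    t = 1.0
    for _ in range(iters):
        x_new = proj(y - grad(y) / L)                   # projected gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum
        x, t = x_new, t_new
    return x
```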

A Framework for Applying Subgradient Methods to Conic Optimization Problems (version 2)

A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. (In version 2, the development of algorithms …
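For intuition in the LP case (cone the nonnegative orthant, strictly feasible point rescaled to the all-ones vector), the transformed objective is essentially the concave, Lipschitz function lambda_min(x) = min_i x_i, maximized over an affine set. The projected supergradient step below is a hedged sketch of that picture, with `A_aug` standing in for whatever linear equations the framework imposes.

```python
import numpy as np

def lambda_min_step(x, A_aug, step):
    """LP-case sketch of the transformed problem: maximize the concave,
    Lipschitz function lambda_min(x) = min_i x_i over an affine set
    (rows of A_aug collect the linear equations).  The supergradient
    and the projected step are illustrative; see the paper for the
    exact scheme."""
    i = int(np.argmin(x))
    g = np.zeros_like(x)
    g[i] = 1.0                      # supergradient of min_i x_i
    pinv = np.linalg.pinv(A_aug)
    g = g - pinv @ (A_aug @ g)      # project onto the null space of A_aug
    return x + step * g             # ascent step stays in the affine set
```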

A Polynomial-Time Affine-Scaling Method for Semidefinite and Hyperbolic Programming

We develop a natural variant of Dikin’s affine-scaling method, first for semidefinite programming and then for hyperbolic programming in general. We match the best complexity bounds known for interior-point methods. All previous polynomial-time affine-scaling algorithms have been for conic optimization problems in which the underlying cone is symmetric. Hyperbolicity cones, however, need not be symmetric. …
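For background only, here is the classical Dikin affine-scaling step for a standard-form LP, of which the paper develops a variant; for semidefinite and hyperbolic programming the diagonal scaling below is replaced by the appropriate generalization.

```python
import numpy as np

def dikin_step(x, A, c, alpha=0.9):
    """One classical Dikin affine-scaling step for the LP
    min c.x  s.t.  Ax = b, x > 0.  Background for the paper, not its
    method.  Assumes x is strictly positive and A has full row rank."""
    D2 = x * x                                  # diag(x)^2, elementwise
    y = np.linalg.solve(A @ (D2[:, None] * A.T), A @ (D2 * c))
    s = c - A.T @ y                             # reduced costs
    d = -D2 * s                                 # affine-scaling direction
    norm = np.linalg.norm(x * s)                # ||D s|| = ||D^{-1} d||
    if norm == 0.0:
        return x                                # already optimal here
    return x + alpha * d / norm                 # stay inside the Dikin ellipsoid
```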

Efficient First-Order Methods for Linear Programming and Semidefinite Programming

We present a simple transformation of any linear program or semidefinite program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally “smoothed,” thereby allowing most first-order …
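A hedged sketch of how iterates of such a transformed problem can be mapped back to feasible points of the original LP: assuming the known strictly feasible point has been rescaled to the all-ones vector, pushing an iterate radially away from it until the smallest coordinate hits zero lands on the feasible region's boundary. The formula below reconstructs that radial map for the orthant; consult the paper for the exact correspondence.

```python
import numpy as np

def radial_projection(x):
    """Map an iterate of the transformed problem to a feasible point of
    the original LP, assuming the strictly feasible point has been
    rescaled to the all-ones vector e.  Pushes x away from e until the
    smallest coordinate hits zero; linear equations Ax = b are preserved
    because the map fixes e and moves along x - e."""
    lam = np.min(x)                           # lambda_min(x) for the orthant
    assert lam < 1.0, "x must differ from e in its minimum coordinate"
    return 1.0 + (x - 1.0) / (1.0 - lam)      # lies on the boundary x >= 0
```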

Central Swaths (A Generalization of the Central Path)

We develop a natural generalization of the notion of the central path — a notion that lies at the heart of interior-point methods for convex optimization. The generalization is accomplished via the “derivative cones” of a “hyperbolicity cone,” the derivatives being direct and mathematically appealing relaxations of the underlying (hyperbolic) conic constraint, be it the non-negative …
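A tiny concrete instance of a derivative cone, as a sanity check (SymPy is used purely for illustration): for p = x1·x2·x3, which is hyperbolic in the direction e = (1,1,1) with hyperbolicity cone the nonnegative orthant, differentiating along e gives a polynomial whose hyperbolicity cone strictly contains the orthant.

```python
import sympy as sp

x1, x2, x3, t = sp.symbols('x1 x2 x3 t')
p = x1 * x2 * x3                      # hyperbolic w.r.t. e = (1,1,1);
                                      # its hyperbolicity cone is the
                                      # nonnegative orthant
# Directional derivative along e: the "derivative polynomial".
shifted = p.subs({x1: x1 + t, x2: x2 + t, x3: x3 + t})
dp = sp.expand(sp.diff(shifted, t).subs(t, 0))
print(dp)   # x1*x2 + x1*x3 + x2*x3: also hyperbolic w.r.t. e, and its
            # hyperbolicity cone strictly contains the orthant, e.g. it
            # holds the point (-0.2, 1, 1) that the orthant excludes
```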

Hyperbolic Programs, and Their Derivative Relaxations

We study the algebraic and facial structures of hyperbolic programs, and examine natural relaxations of hyperbolic programs, the relaxations themselves being hyperbolic programs.

Citation: TR 1406, School of Operations Research, Cornell University, Ithaca, NY 14853, U.S., 3/04