Structured Sparsity via Alternating Direction Methods

We consider a class of sparse learning problems in high dimensional feature space regularized by a structured sparsity-inducing norm which incorporates prior knowledge of the group structure of the features. Such problems often pose a considerable challenge to optimization algorithms due to the non-smoothness and non-separability of the regularization term. In this paper, we focus … Read more
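
A canonical finite-dimensional instance of this problem class is group-lasso-regularized least squares, $\min_x \tfrac12\|Ax-b\|_2^2 + \lambda \sum_g \|x_g\|_2$ with non-overlapping groups. As a rough illustration of how an alternating direction method splits the smooth loss from the non-separable group penalty, here is a minimal ADMM sketch; the function name, the toy data, and the group partition are hypothetical and are not the algorithm developed in the paper.

```python
import numpy as np

def admm_group_lasso(A, b, groups, lam=1.0, rho=1.0, n_iter=200):
    """Minimal ADMM sketch for min_x 0.5*||Ax-b||^2 + lam*sum_g ||x_g||_2.

    `groups` is a list of index arrays partitioning the coordinates
    (a hypothetical non-overlapping group structure)."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    Q = A.T @ A + rho * np.eye(n)          # reused by every x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: ridge-type linear system.
        x = np.linalg.solve(Q, Atb + rho * (z - u))
        # z-update: block soft-thresholding, group by group.
        v = x + u
        for g in groups:
            norm = np.linalg.norm(v[g])
            z[g] = max(0.0, 1.0 - lam / (rho * norm)) * v[g] if norm > 0 else 0.0
        # Dual update.
        u = u + x - z
    return z

# Toy usage with random data and two hypothetical groups.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10)); b = rng.standard_normal(30)
groups = [np.arange(0, 5), np.arange(5, 10)]
print(admm_group_lasso(A, b, groups, lam=2.0))
```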

An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections

We propose a new subgradient method for the minimization of convex functions over a convex set. Common subgradient algorithms require an exact projection onto the feasible region in every iteration, which can be efficient only for problems that admit a fast projection. In our method we use inexact adaptive projections, which only require moving within a … Read more
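
As a toy illustration of the idea of working with approximate projections whose accuracy is tightened adaptively over the iterations (so that iterates may be infeasible), consider the following sketch; the helper names (isp_sketch, approx_proj), the tolerance schedule, and the ball-constrained instance are hypothetical, not the authors' algorithm.

```python
import numpy as np

def isp_sketch(f, subgrad, approx_proj, x0, n_iter=500):
    """Sketch of a subgradient method that only uses approximate projections
    with a tolerance eps_k that shrinks over the iterations.  approx_proj(y, eps)
    is assumed to return a point within eps of the exact projection of y onto
    the feasible set, so iterates may be slightly infeasible."""
    x = np.asarray(x0, dtype=float)
    best, x_best = np.inf, x.copy()
    for k in range(1, n_iter + 1):
        g = subgrad(x)
        step = 1.0 / np.sqrt(k)      # classical diminishing step size
        eps = 1.0 / k                # adaptive projection accuracy
        x = approx_proj(x - step * g, eps)
        if f(x) < best:
            best, x_best = f(x), x.copy()
    return x_best, best

# Toy instance: minimize ||x - c||_1 over the unit Euclidean ball, with the
# "approximate" projection simulated by perturbing the exact one within O(eps).
c = np.array([2.0, -1.5, 0.5])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)
def approx_proj(y, eps):
    p = y / max(1.0, np.linalg.norm(y))      # exact projection onto the ball
    return p + eps * 0.1 * np.ones_like(p)   # bounded inexactness
print(isp_sketch(f, subgrad, approx_proj, np.zeros(3)))
```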

A Sparsity Preserving Stochastic Gradient Method for Composite Optimization

We propose new stochastic gradient algorithms for solving convex composite optimization problems. In each iteration, our algorithms utilize a stochastic oracle of the gradient of the smooth component in the objective function. Our algorithms are based on a stochastic version of the estimate sequence technique introduced by Nesterov (Introductory Lectures on Convex Optimization: A Basic … Read more
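
For intuition only, the sketch below shows a plain stochastic proximal gradient loop for an $\ell_1$-regularized least-squares instance: each iteration queries a stochastic gradient of the smooth component and then applies the $\ell_1$ prox, which can set coordinates exactly to zero. This is not the estimate-sequence scheme of the paper; all names and the toy data are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1; produces exact zeros."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_prox_grad(A, b, lam=0.1, step=0.01, n_epochs=20, seed=0):
    """Plain stochastic proximal gradient sketch for
    min_x mean_i 0.5*(a_i . x - b_i)^2 + lam*||x||_1.
    Each step uses a single-sample stochastic gradient of the smooth part,
    then the l1 prox (unlike a plain subgradient step, which never yields
    exact zeros)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_epochs):
        for i in rng.permutation(m):
            g = (A[i] @ x - b[i]) * A[i]          # stochastic gradient oracle
            x = soft_threshold(x - step * g, step * lam)
    return x

# Toy usage on a sparse ground truth.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(stochastic_prox_grad(A, b, lam=0.05))
```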

Monotonicity recovering and accuracy preserving optimization methods for postprocessing finite element solutions

We suggest here a least-change correction to the available finite element (FE) solution. This postprocessing procedure is aimed at recovering the monotonicity and some other important properties that may not be exhibited by the FE solution. It is based on solving a monotonic regression problem with some extra constraints. One of them is a linear equality-type … Read more
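
The core subproblem is a monotonic (isotonic) regression. As a minimal illustration, the pool-adjacent-violators sketch below computes the least-squares non-decreasing correction of a one-dimensional sequence of nodal values; it omits the extra equality-type constraints mentioned above, and the function name pav is hypothetical.

```python
def pav(y, w=None):
    """Pool-adjacent-violators: least-squares projection of y onto the cone
    of non-decreasing sequences (1-D monotonic regression), with optional
    positive weights w."""
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    blocks = []                      # each block: [value, weight, count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * v1 + w2 * v2) / wt, wt, c1 + c2])
    out = []
    for v, _, c in blocks:
        out.extend([v] * c)
    return out

# Toy usage: a non-monotone sequence of nodal values corrected with least change.
print(pav([1.0, 3.0, 2.0, 2.5, 5.0]))   # -> [1.0, 2.5, 2.5, 2.5, 5.0]
```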

Interior-Point Algorithms for a Generalization of Linear Programming and Weighted Centering

We consider an extension of ordinary linear programming (LP) that adds weighted logarithmic barrier terms for some variables. The resulting problem generalizes both LP and the problem of finding the weighted analytic center of a polytope. We show that the problem has a dual of the same form and give complexity results for several different … Read more
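
For concreteness, one way to write such a problem in finite dimensions (with assumed notation, not necessarily the authors') is sketched below, where $J$ indexes the variables carrying barrier terms and $w_j \ge 0$ are their weights; $J = \emptyset$ recovers ordinary LP, while $c = 0$ with $J$ covering all variables recovers the weighted analytic center problem.

```latex
\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & c^{\mathsf T} x \;-\; \sum_{j \in J} w_j \log x_j \\
\text{s.t.} \quad & Ax = b, \qquad x \ge 0 .
\end{aligned}
\]
```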

There is no variational characterization of the cycles in the method of periodic projections

The method of periodic projections consists in iterating projections onto $m$ closed convex subsets of a Hilbert space according to a periodic sweeping strategy. In the presence of $m\geq 3$ sets, a long-standing question going back to the 1960s is whether the limit cycles obtained by such a process can be characterized as the minimizers … Read more
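
For readers less familiar with the method itself, the toy sketch below sweeps cyclically through projections onto three pairwise-disjoint balls in the plane; with an empty intersection the iterates settle into a limit cycle, approximated here by the points visited during the last sweep. The example and helper names are hypothetical and only illustrate the process whose cycles the paper studies.

```python
import numpy as np

def periodic_projections(projs, x0, n_sweeps=2000):
    """Method of periodic projections: sweep cyclically through the
    projectors P_1, ..., P_m and return the points visited in the last
    sweep, which approximate the limit cycle when the sets do not intersect."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_sweeps):
        cycle = []
        for P in projs:
            x = P(x)
            cycle.append(x.copy())
    return cycle

# Three pairwise-disjoint unit balls in the plane (hypothetical example).
centers = [np.array([0.0, 3.0]), np.array([3.0, -2.0]), np.array([-3.0, -2.0])]
def ball_proj(c):
    # Projection onto the closed unit ball centered at c.
    return lambda x: c + (x - c) / max(1.0, np.linalg.norm(x - c))
cycle = periodic_projections([ball_proj(c) for c in centers], np.zeros(2))
print(np.round(cycle, 4))
```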

A new, solvable, primal relaxation for convex nonlinear integer programming problems

The paper describes a new primal relaxation (PR) for computing bounds on nonlinear integer programming (NLIP) problems. It is a natural extension to NLIP problems of the geometric interpretation of Lagrangean relaxation presented by Geoffrion (1974) for linear problems, and it is based on the same assumption that some constraints are complicating and are treated … Read more

Convexity Conditions of Kantorovich Function and Related Semi-infinite Linear Matrix Inequalities

The Kantorovich function $(x^TAx)( x^T A^{-1} x)$, where $A$ is a positive definite matrix, is not convex in general. From a matrix or convex analysis point of view, it is interesting to address the question: When is this function convex? In this paper, we prove that the 2-dimensional Kantorovich function is convex if and only … Read more
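
As a quick numerical companion to this question (not a substitute for the paper's analysis), the sketch below evaluates the Kantorovich function and probes midpoint convexity on random pairs for a given positive definite $A$; the function names and the diagonal test matrices are hypothetical.

```python
import numpy as np

def kantorovich(A, x):
    """K(x) = (x^T A x)(x^T A^{-1} x) for a positive definite A."""
    Ainv = np.linalg.inv(A)
    return (x @ A @ x) * (x @ Ainv @ x)

def probe_convexity(A, trials=20000, seed=0):
    """Crude numerical probe (not a proof): test the midpoint-convexity
    inequality K((x+y)/2) <= (K(x)+K(y))/2 on random pairs and report the
    largest violation found; a positive value certifies non-convexity."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    worst = 0.0
    for _ in range(trials):
        x, y = rng.standard_normal(n), rng.standard_normal(n)
        gap = kantorovich(A, 0.5 * (x + y)) \
              - 0.5 * (kantorovich(A, x) + kantorovich(A, y))
        worst = max(worst, gap)
    return worst

# A well-conditioned vs. an ill-conditioned 2x2 example.
print(probe_convexity(np.diag([1.0, 2.0])))    # expected: 0.0 (no violation found)
print(probe_convexity(np.diag([1.0, 50.0])))   # expected: clearly positive
```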

New Bounds for Restricted Isometry Constants in Low-rank Matrix Recovery

In this paper, we establish new bounds for restricted isometry constants (RIC) in low-rank matrix recovery. Let $\mathcal{A}$ be a linear transformation from $\mathbb{R}^{m \times n}$ into $\mathbb{R}^p$, and let $r$ be the rank of the recovered matrix $X \in \mathbb{R}^{m \times n}$. Our main result is that if the RIC satisfy the condition $\delta_{2r+k} + 2\left(\tfrac{r}{k}\right)^{1/2} \delta_{\max\{r+\frac{3}{2}k,\,2k\}}$ … Read more

A Double Smoothing Technique for Constrained Convex Optimization Problems and Applications to Optimal Control

In this paper, we propose an efficient approach for solving a class of convex optimization problems in Hilbert spaces. Our feasible region is a (possibly infinite-dimensional) simple convex set, i.e., we assume that projections onto this set are computationally cheap. The problem we consider is the minimization of a convex function over this … Read more
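
As a rough finite-dimensional illustration of the double smoothing idea (not the authors' algorithm or setting), the sketch below regularizes the dual of a box-constrained, equality-constrained linear program twice and runs plain gradient ascent on the doubly smoothed dual instead of a fast gradient method; all names and the toy data are hypothetical.

```python
import numpy as np

def double_smoothing_sketch(c, A, b, lo, hi, mu=1e-3, kappa=1e-3, n_iter=5000):
    """Hedged sketch of double smoothing for
        min c^T x  s.t.  A x = b,  x in the box [lo, hi]  (a "simple" set).
    A mu/2*||x||^2 term inside the dual makes its gradient Lipschitz, and a
    -kappa/2*||lam||^2 term makes it strongly concave; here we use plain
    gradient ascent on the result rather than a fast gradient method."""
    m, n = A.shape
    lam = np.zeros(m)
    L = np.linalg.norm(A, 2) ** 2 / mu + kappa   # Lipschitz constant of the dual gradient
    for _ in range(n_iter):
        # Inner minimizer over the box has a closed form (separable quadratic).
        x = np.clip(-(c + A.T @ lam) / mu, lo, hi)
        grad = A @ x - b - kappa * lam           # gradient of the smoothed dual
        lam = lam + grad / L
    return np.clip(-(c + A.T @ lam) / mu, lo, hi), lam

# Toy usage: minimize sum(x) over the box [0,1]^3 subject to x1 + x2 + x3 = 2.
x, lam = double_smoothing_sketch(np.ones(3), np.ones((1, 3)), np.array([2.0]),
                                 0.0, 1.0)
print(x, x.sum())
```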