An Inexact Perturbed Path-Following Method for Lagrangian Decomposition in Large-Scale Separable Convex Optimization

This paper studies an inexact perturbed path-following algorithm in the framework of Lagrangian dual decomposition for solving large-scale separable convex programming problems. Unlike the exact versions considered in the literature, we propose to solve the primal subproblems inexactly up to a given accuracy. This leads to an inexactness of the gradient vector and the Hessian … Read more
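
As a rough sketch of the Lagrangian dual decomposition framework referred to above (a generic separable model with illustrative symbols $f_i$, $A_i$, $X_i$, not necessarily the paper's exact formulation), the primal problem and its dual function split as
\[
\min_{x_i \in X_i} \sum_{i=1}^{M} f_i(x_i) \ \ \text{s.t.}\ \sum_{i=1}^{M} A_i x_i = b,
\qquad
d(y) = \sum_{i=1}^{M} \min_{x_i \in X_i} \big\{ f_i(x_i) + y^\top A_i x_i \big\} - y^\top b,
\]
so evaluating $d(y)$ decomposes into $M$ independent subproblems; solving each of them only up to a given accuracy is what produces the inexact gradient and Hessian information mentioned in the abstract.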

A First-Order Smoothing Technique for a Class of Large-Scale Linear Programs

We study a class of linear programming (LP) problems motivated by large-scale machine learning applications. After reformulating the LP as a convex nonsmooth problem, we apply Nesterov’s primal-dual smoothing technique. It turns out that the iteration complexity of the smoothing technique depends on a parameter $\theta$ that arises because we need to bound the originally … Read more
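
For context, Nesterov's smoothing replaces a nonsmooth max-structured objective by a smooth surrogate; in generic notation (illustrative only, not the paper's exact reformulation),
\[
f(x) = \max_{u \in U}\ \langle A x, u \rangle - \hat{\phi}(u)
\quad\longrightarrow\quad
f_{\mu}(x) = \max_{u \in U}\ \langle A x, u \rangle - \hat{\phi}(u) - \mu\, d(u),
\]
where $d$ is a strongly convex prox-function on $U$. The surrogate $f_{\mu}$ has a Lipschitz-continuous gradient with constant of order $\|A\|^2/\mu$, so an accelerated gradient method applied to it gives an $O(1/\epsilon)$-type complexity for the original nonsmooth problem; the parameter $\theta$ mentioned above then enters this complexity bound.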

A smooth perceptron algorithm

The perceptron algorithm, introduced in the late fifties in the machine learning community, is a simple greedy algorithm for finding a solution to a finite set of linear inequalities. The algorithm’s main advantages are its simplicity and noise tolerance. The algorithm’s main disadvantage is its slow convergence rate. We propose a modified version of the … Read more
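
For reference, here is a minimal sketch of the classical (unsmoothed) perceptron the abstract starts from, posed as finding $x$ with $Ax > 0$ for a matrix $A$ whose rows are the inequality vectors; the function and variable names are illustrative, and the smoothed variant proposed in the paper is not reproduced here.

import numpy as np

def perceptron(A, max_iter=10000):
    """Greedy perceptron: seek x with A @ x > 0, rows of A normalized."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)  # normalize each inequality vector
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        violated = np.flatnonzero(A @ x <= 0)   # indices of unsatisfied inequalities
        if violated.size == 0:
            return x                            # every inequality is strictly satisfied
        x += A[violated[0]]                     # greedy correction on one violated row
    return x                                    # may still be infeasible if max_iter is too small

The classical Block-Novikoff analysis bounds the number of correction steps by $1/\rho^2$, where $\rho$ is the margin of a feasible instance; this is the slow convergence the abstract refers to, and the smooth variant proposed in the paper targets exactly that drawback.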

A Double Smoothing Technique for Constrained Convex Optimization Problems and Applications to Optimal Control

In this paper, we propose an efficient approach for solving a class of convex optimization problems in Hilbert spaces. Our feasible region is a (possibly infinite-dimensional) simple convex set, i.e., we assume that projections onto this set are easy to compute. The problem we consider is the minimization of a convex function over this … Read more
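
To illustrate the "simple set" assumption (projections are cheap), here is a bare projected-gradient sketch in finite dimensions; `grad_f`, `project`, and the box example are illustrative placeholders, not the double smoothing scheme of the paper.

import numpy as np

def projected_gradient(grad_f, project, x0, step, n_iter=500):
    """Gradient step followed by Euclidean projection onto the feasible set."""
    x = x0.copy()
    for _ in range(n_iter):
        x = project(x - step * grad_f(x))   # the projection is assumed cheap
    return x

# Example: minimize ||x - c||^2 over the box [0, 1]^3; the projection is clipping.
c = np.array([1.5, -0.3, 0.4])
x_star = projected_gradient(grad_f=lambda x: 2.0 * (x - c),
                            project=lambda x: np.clip(x, 0.0, 1.0),
                            x0=np.zeros(3), step=0.25)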

Double smoothing technique for infinite-dimensional optimization problems with applications to Optimal Control

In this paper, we propose an efficient technique for solving some infinite-dimensional problems over sets of functions of time. In our problem, besides the convex point-wise constraints on the state variables, we have convex coupling constraints with a finite-dimensional image. Hence, we can formulate a finite-dimensional dual problem, which can be solved by efficient gradient methods. … Read more
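
Below is a generic sketch of the kind of efficient gradient method that can be run on such a finite-dimensional (smoothed) dual: accelerated gradient ascent on a concave dual function $d$ with an $L$-Lipschitz gradient. The names `grad_d`, `L`, and `y0` are placeholders; the paper's double smoothing construction of $d$ is not reproduced here.

import numpy as np

def fast_dual_ascent(grad_d, y0, L, n_iter=200):
    """Nesterov-accelerated gradient ascent on a concave, L-smooth function d."""
    y = y0.copy()
    z = y0.copy()
    t = 1.0
    for _ in range(n_iter):
        y_next = z + grad_d(z) / L                         # ascent step at the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # momentum schedule
        z = y_next + ((t - 1.0) / t_next) * (y_next - y)   # extrapolation
        y, t = y_next, t_next
    return y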

Smoothing Method of Multipliers for Sum-Max Problems

We study a nonsmooth unconstrained optimization problem which includes a sum of pairwise maxima of smooth functions. Minimum $l_1$-norm approximation is a particular case of this problem. Combining the idea of Lagrange multipliers with a smooth approximation of the max-type function, we obtain a new kind of nonquadratic augmented Lagrangian. Our approach does not require artificial variables, and preserves the sparse structure … Read more
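
One standard way to smooth the pairwise maxima the abstract mentions (a log-sum-exp illustration only; the paper's nonquadratic augmented Lagrangian is a different, multiplier-based construction) is
\[
\max\{a, b\} \;\approx\; \frac{1}{p}\,\log\!\big(e^{p a} + e^{p b}\big),
\qquad
0 \;\le\; \frac{1}{p}\,\log\!\big(e^{p a} + e^{p b}\big) - \max\{a, b\} \;\le\; \frac{\log 2}{p},
\]
so a sum of $m$ pairwise maxima is approximated uniformly within $m \log 2 / p$. The minimum $l_1$-norm approximation problem fits this template because $|a_j^\top x - b_j| = \max\{a_j^\top x - b_j,\; b_j - a_j^\top x\}$.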