Iteration-complexity of block-decomposition algorithms and the alternating minimization augmented Lagrangian method

In this paper, we consider the monotone inclusion problem consisting of the sum of a continuous monotone map and a point-to-set maximal monotone operator with a separable two-block structure, and we introduce a framework of block-decomposition prox-type algorithms for solving it which allows each of the single-block proximal subproblems to be solved in an …
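
For orientation, here is a minimal sketch of a two-block alternating minimization augmented Lagrangian (ADMM-style) iteration on the toy model min ½‖x − a‖² + ½‖z − b‖² s.t. x = z; the model, the penalty rho, and the closed-form block updates are illustrative choices, not the paper's framework:

```python
import numpy as np

def admm(a, b, rho=1.0, iters=200):
    """Two-block alternating minimization augmented Lagrangian (ADMM-style)
    iteration for: min 0.5*||x - a||^2 + 0.5*||z - b||^2  s.t.  x - z = 0."""
    x, z, u = np.zeros_like(a), np.zeros_like(b), np.zeros_like(a)
    for _ in range(iters):
        # Block 1: single-block proximal subproblem in x (closed form here).
        x = (a + rho * (z - u)) / (1.0 + rho)
        # Block 2: single-block proximal subproblem in z (closed form here).
        z = (b + rho * (x + u)) / (1.0 + rho)
        # Multiplier (scaled dual variable) update.
        u = u + x - z
    return x, z

a, b = np.array([1.0, 2.0]), np.array([3.0, 0.0])
print(admm(a, b))  # both blocks approach the consensus point (a + b) / 2
```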

Approximating Stationary Points of Stochastic Mathematical Programs with Equilibrium Constraints via Sample Averaging

We investigate sample average approximation of a general class of one-stage stochastic mathematical programs with equilibrium constraints. By using graphical convergence of unbounded set-valued mappings, we demonstrate almost sure convergence of a sequence of stationary points of sample average approximation problems to their true counterparts as the sample size increases. In particular, we show the …
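
The sample average approximation idea itself is easy to state: replace the expectation by an empirical mean over N draws and solve the resulting deterministic problem. A minimal sketch on a toy unconstrained objective (the distribution and objective are illustrative; no equilibrium constraint is modeled):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def saa_solve(n_samples):
    """Solve min_x E[(x - xi)^2] with xi ~ N(1, 1) by replacing the
    expectation with a sample average over n_samples draws."""
    xi = rng.normal(loc=1.0, scale=1.0, size=n_samples)
    return minimize(lambda x: np.mean((x - xi) ** 2), x0=[0.0]).x[0]

# As the sample size grows, the SAA solutions approach the true minimizer 1.
for n in (10, 100, 10_000):
    print(n, saa_solve(n))
```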

Accelerated Block-Coordinate Relaxation for Regularized Optimization

We discuss minimization of the sum of a smooth function and a separable regularization function that induces structure in the solution. A block-coordinate relaxation approach with proximal linearized subproblems yields convergence to critical points, while identification of the optimal manifold (under a nondegeneracy condition) allows acceleration techniques to be applied on a reduced space. The …
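
A minimal sketch of the kind of block-coordinate relaxation step the abstract describes, on ½‖Ax − b‖² + λ‖x‖₁ with cyclic blocks and a proximal linearized (soft-thresholding) subproblem; the block partition, stepsize, and test problem below are illustrative, not the paper's method:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (separable, closed form)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcd(A, b, lam, n_blocks=4, sweeps=200):
    """Cyclic block-coordinate relaxation for 0.5*||Ax - b||^2 + lam*||x||_1;
    each block takes a proximal linearized (soft-thresholding) step."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz bound on grad f
    for _ in range(sweeps):
        for blk in np.array_split(np.arange(x.size), n_blocks):
            g = A.T @ (A @ x - b)               # fresh gradient; only blk used
            x[blk] = soft_threshold(x[blk] - step * g[blk], step * lam)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20); x_true[:3] = (1.0, -2.0, 0.5)    # sparse ground truth
x = bcd(A, A @ x_true, lam=0.5)
print(np.flatnonzero(np.abs(x) > 1e-6))  # nonzeros concentrate on {0, 1, 2}
```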

Solving Infinite-dimensional Optimization Problems by Polynomial Approximation

We solve a class of convex infinite-dimensional optimization problems using a numerical approximation method that does not rely on discretization. Instead, we restrict the decision variable to a sequence of finite-dimensional linear subspaces of the original infinite-dimensional space and solve the corresponding finite-dimensional problems in an efficient way using structured convex optimization techniques. We prove …
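
A minimal sketch of the subspace-restriction idea on a toy functional: minimizing ∫₀¹ (u(t) − t²)² dt over polynomials of degree at most d reduces to a finite-dimensional quadratic program in the coefficients (the functional and the monomial basis are illustrative choices, not the paper's method):

```python
import numpy as np

def best_polynomial(degree):
    """Minimize J(u) = int_0^1 (u(t) - t**2)**2 dt over polynomials
    u(t) = sum_i c[i]*t**i of degree <= degree. In the coefficients c,
    J is quadratic: c^T H c - 2 b^T c + const, so optimality is H c = b."""
    i = np.arange(degree + 1)
    H = 1.0 / (i[:, None] + i[None, :] + 1.0)   # Gram matrix of monomials
    b = 1.0 / (i + 3.0)                         # <t**i, t**2> on [0, 1]
    return np.linalg.solve(H, b)

print(best_polynomial(2))  # ~[0, 0, 1]: the subspace already contains t**2
print(best_polynomial(1))  # ~[-1/6, 1]: best affine fit to t**2 in L2(0, 1)
```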

Double smoothing technique for infinite-dimensional optimization problems with applications to Optimal Control

In this paper, we propose an efficient technique for solving a class of infinite-dimensional problems over sets of functions of time. In our problem, besides the convex pointwise constraints on state variables, we have convex coupling constraints with finite-dimensional image. Hence, we can formulate a finite-dimensional dual problem, which can be solved by efficient gradient methods. …
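
A minimal sketch of the dual approach on a toy model, min ½‖x‖² s.t. Ax = b, whose dual is finite-dimensional and can be maximized by a plain gradient method (no smoothing is applied here; the model and stepsize are illustrative, not the paper's technique):

```python
import numpy as np

def dual_gradient(A, b, iters=500):
    """Solve min 0.5*||x||^2 s.t. Ax = b by gradient ascent on the
    finite-dimensional dual g(lam) = -0.5*||A.T @ lam||^2 - lam @ b."""
    lam = np.zeros(A.shape[0])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz const. of grad g
    for _ in range(iters):
        x = -A.T @ lam                      # primal minimizer of Lagrangian
        lam = lam + step * (A @ x - b)      # dual gradient = primal residual
    return -A.T @ lam

A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])
x = dual_gradient(A, b)
print(x, A @ x - b)   # minimum-norm solution [0, 1, 1]; residual near 0
```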

Optimality conditions for various efficient solutions involving coderivatives: from set-valued optimization problems to set-valued equilibrium problems

We present a new approach to the study of a set-valued equilibrium problem (for short, SEP) through the study of a set-valued optimization problem with a geometric constraint (for short, SOP), based on an equivalence between solutions of these problems. As illustrations, we adapt to SEP enhanced notions of relative Pareto efficient solutions introduced in …
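
For context, coderivative-based optimality conditions in this literature typically take the following shape (a representative condition stated under suitable qualification assumptions, not the paper's exact result): if (x̄, ȳ) is a local Pareto efficient solution of minimizing F: X ⇉ Y over Ω with respect to an ordering cone C, then

\[
\exists\, y^{*} \in C^{+} \setminus \{0\} : \qquad
0 \in D^{*}F(\bar{x}, \bar{y})(y^{*}) + N(\bar{x}; \Omega),
\]

where D*F is the (limiting) coderivative of F, N(·; Ω) is the normal cone to Ω, and C⁺ is the positive polar cone of C.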

A Practical Relative Error Criterion for Augmented Lagrangians

This paper develops a new error criterion for the approximate minimization of augmented Lagrangian subproblems. This criterion is practical in the sense that it requires only information that is ordinarily readily available, such as the gradient (or a subgradient) of the augmented Lagrangian. It is also “relative” in the sense of relative error criteria for …
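
A minimal sketch of an inexact augmented Lagrangian loop in which the subproblem is stopped by a relative test comparing the Lagrangian gradient to the constraint residual; the specific test, the model problem, and the constant sigma below are illustrative, not the criterion developed in the paper:

```python
import numpy as np

def inexact_alm(Q, c, A, b, rho=10.0, sigma=0.5, outer=30):
    """Augmented Lagrangian method for min 0.5*x'Qx + c'x  s.t.  Ax = b.
    The subproblem is minimized inexactly by gradient steps, stopped once
    ||grad_x L_rho(x, lam)|| <= sigma*||Ax - b||, a relative error test."""
    x, lam = np.zeros(Q.shape[0]), np.zeros(A.shape[0])
    L = np.linalg.norm(Q + rho * A.T @ A, 2)   # Lipschitz const. of grad_x L
    for _ in range(outer):
        for _ in range(100_000):               # inner: inexact minimization
            r = A @ x - b
            g = Q @ x + c + A.T @ (lam + rho * r)
            if np.linalg.norm(g) <= sigma * np.linalg.norm(r) + 1e-12:
                break
            x = x - g / L
        lam = lam + rho * (A @ x - b)          # multiplier update
    return x, lam

Q, c = np.diag([1.0, 2.0]), np.array([-1.0, 0.0])
A, b = np.array([[1.0, 1.0]]), np.array([1.0])
x, lam = inexact_alm(Q, c, A, b)
print(x, lam)   # approaches the KKT point x = (1, 0), lam = 0
```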

Complexity of variants of Tseng’s modified F-B splitting and Korpelevich’s methods for generalized variational inequalities with applications to saddle point and convex optimization problems

In this paper, we consider both a variant of Tseng’s modified forward-backward splitting method and an extension of Korpelevich’s method for solving generalized variational inequalities with Lipschitz continuous operators. By showing that these methods are special cases of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter, we derive iteration-complexity bounds for them …
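
For reference, a minimal sketch of Korpelevich's classical extragradient step for a monotone variational inequality VI(F, C); the operator, the box feasible set, and the stepsize are illustrative, and the paper's variants and complexity analysis are not reproduced here:

```python
import numpy as np

def extragradient(F, project, x0, step, iters=500):
    """Korpelevich's extragradient method for the variational inequality:
    find x* in C with <F(x*), x - x*> >= 0 for all x in C."""
    x = x0
    for _ in range(iters):
        y = project(x - step * F(x))   # prediction step at the current point
        x = project(x - step * F(y))   # correction step reuses F at y
    return x

# Illustrative monotone Lipschitz operator and a box feasible set C.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])      # skew part: monotone rotation
F = lambda x: M @ x + 0.1 * x                # add a strongly monotone term
project = lambda x: np.clip(x, -1.0, 1.0)    # projection onto [-1, 1]^2
x = extragradient(F, project, np.array([1.0, 1.0]), step=0.5)
print(x)   # approaches the unique solution x* = (0, 0), where F vanishes
```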

Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: a Generic Algorithmic Framework

In this paper we present a generic algorithmic framework, namely, the accelerated stochastic approximation (AC-SA) algorithm, for solving strongly convex stochastic composite optimization (SCO) problems. While the classical stochastic approximation (SA) algorithms are asymptotically optimal for solving differentiable and strongly convex problems, the AC-SA algorithm, when employed with proper stepsize policies, can achieve optimal or …
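
A minimal sketch of an AC-SA-style accelerated iteration driven by stochastic gradients on a strongly convex test problem; the stepsize choices below are a simple illustration, not the policies analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def ac_sa_like(sgrad, x0, L, mu, iters=2000):
    """Accelerated stochastic approximation sketch: a Nesterov-type
    three-point scheme driven by stochastic gradients. The stepsizes are a
    simple illustrative choice, not the AC-SA policies from the paper."""
    x = x_ag = x0
    for t in range(1, iters + 1):
        alpha = 2.0 / (t + 1)
        x_md = (1 - alpha) * x_ag + alpha * x   # middle (search) point
        g = sgrad(x_md)                         # unbiased stochastic gradient
        x = x - g / (mu * t + L)                # shrinking stepsize
        x_ag = (1 - alpha) * x_ag + alpha * x   # aggregated iterate
    return x_ag

# Strongly convex test problem f(x) = 0.5*mu*||x - 1||^2 with gradient noise.
mu, L = 1.0, 1.0
sgrad = lambda x: mu * (x - 1.0) + 0.1 * rng.standard_normal(x.shape)
print(ac_sa_like(sgrad, np.zeros(3), L, mu))   # approaches the minimizer at 1
```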