Sequential Threshold Control in Descent Splitting Methods for Decomposable Optimization Problems

We suggest a modification of the descent splitting methods for decomposable composite optimization problems, which maintains the basic convergence properties but reduces the computational expense per iteration and allows the computations to be carried out in a distributed manner. It consists of coordinate-wise steps combined with a special threshold control.
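For illustration only, a generic block-coordinate step with a threshold rule of this flavor might look as follows; the objective structure, the update rule, and the threshold schedule below are our own assumptions, not the authors' method:

    import numpy as np

    def threshold_block_descent(partial_grads, x_blocks, step=0.1, tau=1.0,
                                shrink=0.5, tol=1e-8, max_iter=1000):
        # Hypothetical sketch: partial_grads[i](x) returns the gradient of the
        # objective with respect to block i, given the list of block variables x.
        # Only blocks whose partial gradient exceeds the current threshold tau
        # are updated; when no block qualifies, the threshold is reduced.
        x = [np.array(b, dtype=float) for b in x_blocks]
        for _ in range(max_iter):
            grads = [g(x) for g in partial_grads]            # per-block gradients
            active = [i for i, gi in enumerate(grads)
                      if np.linalg.norm(gi) > tau]
            if not active:
                if tau <= tol:                               # threshold exhausted
                    break
                tau *= shrink                                # tighten the threshold
                continue
            for i in active:                                 # cheap coordinate-wise steps
                x[i] = x[i] - step * grads[i]
        return x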

On the optimal order of worst case complexity of direct search

The worst case complexity of direct-search methods has recently been analyzed when they use positive spanning sets and impose a sufficient decrease condition to accept new iterates. Assuming that the objective function is smooth, it is now known that such methods require at most $O(n^2 \epsilon^{-2})$ function evaluations to compute a gradient of norm below …
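A minimal sketch of a direct-search iteration of the kind analyzed here, using the coordinate directions as the positive spanning set and the forcing function $\rho(\alpha)=c\,\alpha^2$ for the sufficient decrease test (these concrete choices are ours):

    import numpy as np

    def direct_search(f, x0, alpha=1.0, c=1e-4, theta=0.5, budget=2000):
        # Poll the positive spanning set D = {+e_i, -e_i}; accept a trial point
        # only if it decreases f by at least rho(alpha) = c * alpha**2,
        # otherwise shrink the step size (unsuccessful iteration).
        x = np.asarray(x0, dtype=float)
        fx, evals = f(x), 1
        D = np.vstack([np.eye(x.size), -np.eye(x.size)])
        while evals < budget and alpha > 1e-10:
            for d in D:
                xt = x + alpha * d
                ft = f(xt); evals += 1
                if ft < fx - c * alpha**2:       # sufficient decrease satisfied
                    x, fx = xt, ft
                    break
            else:
                alpha *= theta                   # no polling direction succeeded
        return x, fx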

The direct extension of ADMM for three-block separable convex minimization models is convergent when one function is strongly convex

The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that the convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective …
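For concreteness, for $\min\{\theta_1(x_1)+\theta_2(x_2)+\theta_3(x_3): A_1x_1+A_2x_2+A_3x_3=b\}$ the directly extended three-block scheme discussed here is usually written, with augmented Lagrangian $\mathcal{L}_\beta$ and penalty $\beta>0$ (notation ours, not taken from the paper), as
\[
\begin{aligned}
x_1^{k+1} &\in \arg\min_{x_1}\, \mathcal{L}_\beta\big(x_1, x_2^{k}, x_3^{k}, \lambda^{k}\big),\\
x_2^{k+1} &\in \arg\min_{x_2}\, \mathcal{L}_\beta\big(x_1^{k+1}, x_2, x_3^{k}, \lambda^{k}\big),\\
x_3^{k+1} &\in \arg\min_{x_3}\, \mathcal{L}_\beta\big(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^{k}\big),\\
\lambda^{k+1} &= \lambda^{k} - \beta\big(A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b\big).
\end{aligned}
\]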

On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions

In this paper we present a variant of the proximal forward-backward splitting method for solving nonsmooth optimization problems in Hilbert spaces, when the objective function is the sum of two nondifferentiable convex functions. The proposed iteration, which will be called the Proximal Subgradient Splitting Method, extends the classical projected subgradient iteration for important classes of …
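One natural way to write such an iteration, with $g$ the proximable term, step sizes $\alpha_k>0$, and notation ours, is
\[
x^{k+1} = \operatorname{prox}_{\alpha_k g}\big(x^{k} - \alpha_k u^{k}\big), \qquad u^{k} \in \partial f(x^{k}),
\]
which collapses to the projected subgradient method when $g$ is the indicator function of a closed convex set.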

Stochastic Compositional Gradient Descent: Algorithms for Minimizing Compositions of Expected-Value Functions

Classical stochastic gradient methods are well suited for minimizing expected-value objective functions. However, they do not apply to the minimization of a nonlinear function involving expected values or a composition of two expected-value functions, i.e., problems of the form $\min_x \mathbb{E}_v\big[f_v\big(\mathbb{E}_w[g_w(x)]\big)\big]$. In order to solve this stochastic composition problem, we propose a class …
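A two-time-scale update of the kind studied in this setting keeps a running estimate $y^k$ of the inner expectation $\mathbb{E}_w[g_w(x^k)]$ and uses it inside a quasi-gradient step; the step sizes $\alpha_k,\beta_k$ and the exact averaging rule below are illustrative:
\[
y^{k+1} = (1-\beta_k)\,y^{k} + \beta_k\, g_{w_k}(x^{k}), \qquad
x^{k+1} = x^{k} - \alpha_k\, \nabla g_{w_k}(x^{k})^{\top}\, \nabla f_{v_k}\big(y^{k+1}\big).
\]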

On the Information-Adaptive Variants of the ADMM: an Iteration Complexity Perspective

Designing algorithms for an optimization model often amounts to striking a balance between the amount of information requested from the model on the one hand and the computational speed one can expect on the other. Naturally, the more information is available, the faster one can expect the algorithm to converge. The popular algorithm of …

Directional Hölder metric subregularity and application to tangent cones

In this work, we study directional versions of the Hölderian/Lipschitzian metric subregularity of multifunctions. Firstly, we establish variational characterizations of the Hölderian/Lipschitzian directional metric subregularity by means of the strong slopes and then of mixed tangency-coderivative objects. As a by-product, we give second-order conditions for the directional Lipschitzian metric subregularity and for the directional metric …
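In the literature, Hölder (order $q\in(0,1]$) metric subregularity of a multifunction $F$ at $(\bar x,\bar y)\in\operatorname{gph} F$ in a direction $u$ is commonly formulated as the existence of constants $\kappa,\varepsilon,\delta>0$ such that (the precise formulation used in the paper may differ)
\[
d\big(x, F^{-1}(\bar y)\big) \;\le\; \kappa\, d\big(\bar y, F(x)\big)^{q}
\quad\text{for all } x \in B(\bar x,\varepsilon) \cap \big(\bar x + \operatorname{cone} B(u,\delta)\big).
\]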

HIPAD – A Hybrid Interior-Point Alternating Direction algorithm for knowledge-based SVM and feature selection

We consider classification tasks in the regime of scarce labeled training data in a high-dimensional feature space, where specific expert knowledge is also available. We propose a new hybrid optimization algorithm that solves the elastic-net support vector machine (SVM) through an alternating direction method of multipliers in the first phase, followed by an interior-point method …
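For reference, the elastic-net SVM mentioned above is commonly posed, for training pairs $(x_i,y_i)$ with $y_i\in\{-1,+1\}$ and regularization weights $\lambda_1,\lambda_2>0$ (how HIPAD incorporates the expert knowledge is not shown here), as
\[
\min_{w,\,b}\;\; \sum_{i=1}^{m} \max\big(0,\, 1 - y_i(w^{\top}x_i + b)\big) \;+\; \lambda_1 \|w\|_1 \;+\; \tfrac{\lambda_2}{2}\,\|w\|_2^{2}.
\]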

Nonlinear local error bounds via a change of metric

In this work, we improve the approach of Corvellec-Motreanu to nonlinear error bounds for lower semicontinuous functions on complete metric spaces, an approach that consists in reducing the nonlinear case to the linear one through a change of metric. This improvement is essentially technical, and it allows one to deal with local error bounds in an appropriate way. …

Conditional Gradient Sliding for Convex Optimization

In this paper, we present a new conditional gradient type method for convex optimization by utilizing a linear optimization (LO) oracle to minimize a series of linear functions over the feasible set. Different from the classic conditional gradient method, the conditional gradient sliding (CGS) algorithm developed herein can skip the computation of gradients from time …
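For context, the classic conditional gradient step that CGS builds on calls the LO oracle once per iteration; below is a minimal sketch over the probability simplex (the test problem, the step-size rule, and the oracle are our illustrative choices, not the CGS scheme itself):

    import numpy as np

    def conditional_gradient(grad, x0, lo_oracle, max_iter=200):
        # Classic Frank-Wolfe iteration: each step solves a linear problem over
        # the feasible set (the LO oracle) instead of computing a projection.
        x = np.asarray(x0, dtype=float)
        for k in range(max_iter):
            g = grad(x)
            s = lo_oracle(g)                  # argmin over the feasible set of <g, s>
            gamma = 2.0 / (k + 2.0)           # standard open-loop step size
            x = (1.0 - gamma) * x + gamma * s
        return x

    # Toy usage: minimize ||x - c||^2 over the probability simplex.
    c = np.array([0.2, 0.5, 0.3])
    grad = lambda x: 2.0 * (x - c)
    lo = lambda g: np.eye(g.size)[np.argmin(g)]   # best vertex of the simplex
    x_star = conditional_gradient(grad, np.ones(3) / 3.0, lo)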