A Random Block-Coordinate Douglas-Rachford Splitting Method with Low Computational Complexity for Binary Logistic Regression

In this paper, we propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas-Rachford splitting method. Our algorithm sweeps the training set by randomly selecting a mini-batch of data at each iteration, allowing the variables to be updated in a block-coordinate manner. Our approach …
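For intuition on the underlying fixed-point iteration, here is a minimal sketch of plain, deterministic Douglas-Rachford splitting applied to the full-batch sparse logistic regression problem, minimizing the logistic loss plus an $\ell_1$ penalty. This is not the paper's randomized block-coordinate method: the step size `gamma`, the inner gradient solver used to approximate the loss prox, and all names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Exact prox of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_logistic(v, A, y, gamma, n_inner=50):
    # Inexact prox of the logistic loss: minimize
    #   sum_i log(1 + exp(-y_i * a_i^T w)) + ||w - v||^2 / (2 * gamma)
    # by a few gradient steps (the prox has no closed form).
    L = 0.25 * np.linalg.norm(A, 2) ** 2 + 1.0 / gamma  # Lipschitz constant of the gradient
    w = v.copy()
    for _ in range(n_inner):
        s = np.clip(-y * (A @ w), -30, 30)
        grad = A.T @ (-y / (1.0 + np.exp(-s))) + (w - v) / gamma
        w -= grad / L
    return w

def douglas_rachford(A, y, lam, gamma=1.0, n_iter=100):
    w = z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(z, gamma * lam)         # prox of the l1 penalty
        u = prox_logistic(2 * w - z, A, y, gamma)  # prox of the loss at the reflection
        z = z + u - w                              # update of the governing sequence
    return w

# Toy data (assumed setup): labels from a noisy linear classifier.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
y = np.sign(A @ rng.standard_normal(50) + 0.5 * rng.standard_normal(200))
w_hat = douglas_rachford(A, y, lam=0.1)
print("nonzero coefficients:", np.count_nonzero(np.abs(w_hat) > 1e-6))
```

The paper's contribution replaces this full-batch sweep with randomly selected mini-batches and block-coordinate updates of the splitting variables.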

A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

Stochastic approximation techniques play an important role in solving many problems encountered in machine learning or adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori, or their direct computation is too intensive, and they thus have to be estimated online from the observed signals. For batch optimization of …
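As a rough illustration of the majorize-minimize subspace idea in an online setting, the sketch below minimizes an $\ell_2$-penalized least squares surrogate over the two-dimensional subspace spanned by the current gradient and the previous update, with second-order statistics estimated recursively from a data stream. It is only loosely inspired by the abstract; the running-average statistics, the memory-gradient subspace choice, and all names are assumptions, not the paper's algorithm.

```python
import numpy as np

def online_mm_subspace(stream, dim, lam=0.1, n_steps=500):
    R = np.zeros((dim, dim))   # running estimate of E[a a^T]
    r = np.zeros(dim)          # running estimate of E[y a]
    w = np.zeros(dim)
    d_prev = np.zeros(dim)
    for n, (a, y) in enumerate(stream, start=1):
        if n > n_steps:
            break
        # Recursive (running-average) update of the data statistics.
        R += (np.outer(a, a) - R) / n
        r += (y * a - r) / n
        H = R + lam * np.eye(dim)          # curvature of the penalized LS surrogate
        g = H @ w - r                      # gradient of the surrogate at w
        D = np.column_stack([g, d_prev])   # memory-gradient subspace directions
        M = D.T @ H @ D
        # Exact minimizer of the quadratic surrogate within span(D);
        # lstsq handles the first iteration, where d_prev is zero.
        u = np.linalg.lstsq(M, -D.T @ g, rcond=None)[0]
        d_prev = D @ u
        w = w + d_prev
    return w

# Toy stream of noisy linear observations y = a^T w_true + noise (assumed setup).
rng = np.random.default_rng(1)
w_true = rng.standard_normal(20)

def make_stream():
    while True:
        a = rng.standard_normal(20)
        yield a, a @ w_true + 0.1 * rng.standard_normal()

w_hat = online_mm_subspace(make_stream(), dim=20)
print("estimation error:", np.linalg.norm(w_hat - w_true))
```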

A Block Coordinate Variable Metric Forward-Backward Algorithm

A number of recent works have emphasized the prominent role played by the Kurdyka-Lojasiewicz inequality in proving the convergence of iterative algorithms for solving possibly nonsmooth/nonconvex optimization problems. In this work, we consider the minimization of an objective function satisfying this property, which is the sum of a not necessarily convex differentiable function and a non …
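To make the iteration family concrete, here is a minimal sketch of a block-coordinate forward-backward (proximal gradient) method on a toy problem, minimizing $0.5\|Ax-b\|^2 + \lambda\|x\|_1$ with a per-block metric given by block Lipschitz constants. It illustrates the generic scheme only; the random block order, block sizes, and metric choice are assumptions, and none of the paper's KL-based convergence machinery appears here.

```python
import numpy as np

def bc_forward_backward(A, b, lam, block_size=10, n_epochs=50, seed=0):
    n = A.shape[1]
    x = np.zeros(n)
    blocks = [np.arange(j, min(j + block_size, n)) for j in range(0, n, block_size)]
    # Per-block diagonal metric: Lipschitz constant of each block gradient.
    L = [np.linalg.norm(A[:, J], 2) ** 2 for J in blocks]
    rng = np.random.default_rng(seed)
    residual = A @ x - b
    for _ in range(n_epochs):
        for j in rng.permutation(len(blocks)):
            J = blocks[j]
            grad_J = A[:, J].T @ residual  # block gradient of the smooth term
            z = x[J] - grad_J / L[j]       # forward (gradient) step in the metric
            x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L[j], 0.0)  # backward (prox) step
            residual += A[:, J] @ (x_new - x[J])  # keep residual Ax - b current
            x[J] = x_new
    return x

# Toy sparse recovery instance (assumed setup).
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 60))
x_true = np.zeros(60)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = bc_forward_backward(A, b, lam=0.5)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))
```

Maintaining the residual incrementally keeps each block update at a cost proportional to the block size, which is what makes block-coordinate sweeps attractive in the first place.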

Variable Metric Forward-Backward Algorithm for Minimizing the Sum of a Differentiable Function and a Convex Function

We consider the minimization of a function $G$ defined on $\mathbb{R}^N$, which is the sum of a (not necessarily convex) differentiable function and a (not necessarily differentiable) convex function. Moreover, we assume that $G$ satisfies the Kurdyka-Lojasiewicz property. Such a problem can be solved with the Forward-Backward algorithm. However, this algorithm may suffer from …
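For a concrete instance of the forward-backward step in a non-trivial metric, the sketch below uses a fixed diagonal matrix that majorizes the Hessian of the smooth term, so that a unit preconditioned step is safe. The paper's algorithm lets the metric vary along the iterations and covers nonconvex smooth terms; the fixed diagonal majorant used here and all names are illustrative assumptions.

```python
import numpy as np

def vm_forward_backward(A, b, lam, n_iter=200):
    n = A.shape[1]
    # Fixed diagonal metric majorizing A^T A (diagonally dominant construction),
    # so a unit preconditioned step decreases the objective.
    d = np.abs(A.T @ A).sum(axis=1)
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)  # forward step: gradient of the smooth term
        z = x - grad / d          # preconditioned gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / d, 0.0)  # prox of lam*||.||_1 in the metric
    return x

# Toy sparse recovery instance (assumed setup).
rng = np.random.default_rng(3)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[::8] = 2.0
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = vm_forward_backward(A, b, lam=0.5)
print("estimation error:", np.linalg.norm(x_hat - x_true))
```

Because the prox is taken in a diagonal metric, it remains an elementwise soft-thresholding, just with a coordinate-dependent threshold $\lambda / d_i$.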