Inertial Block Mirror Descent Method for Non-Convex Non-Smooth Optimization

In this paper, we propose inertial versions of block coordinate descent methods for solving non-convex non-smooth composite optimization problems. We use the general framework of Bregman distance functions to compute the proximal maps. Our method not only allows using two different extrapolation points to evaluate the gradients and to add the inertial force, but also takes advantage …
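The snippet below is a minimal sketch of one such inertial block update in the Euclidean (quadratic Bregman) case, with two separate extrapolation points: one to evaluate the partial gradient and one to carry the inertial force inside the proximal term. The quadratic coupling term, the $\ell_1$ regularizer, and all parameter names are illustrative assumptions, not the paper's exact setting.

```python
# Sketch of one inertial block update for the x-block of
#   min 0.5||Ax + By - b||^2 + lam(||x||_1 + ||y||_1)
# assuming a Euclidean Bregman distance (so the Bregman proximal map is the
# usual soft-thresholding prox).
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_block_step(x_k, x_prev, y_k, A, B, b, lam, step, beta, gamma):
    """Two extrapolation points: x_bar for the gradient, x_tilde for inertia."""
    x_bar = x_k + beta * (x_k - x_prev)       # extrapolation for the gradient
    x_tilde = x_k + gamma * (x_k - x_prev)    # extrapolation for the inertial force
    grad_x = A.T @ (A @ x_bar + B @ y_k - b)  # partial gradient at x_bar
    return soft_threshold(x_tilde - step * grad_x, step * lam)

# toy usage (the y-block would be updated analogously in an alternating scheme)
rng = np.random.default_rng(0)
A, B = rng.standard_normal((20, 10)), rng.standard_normal((20, 10))
b = rng.standard_normal(20)
x = x_prev = np.zeros(10)
y = np.zeros(10)
step = 1.0 / np.linalg.norm(A.T @ A, 2)
for _ in range(100):
    x, x_prev = inertial_block_step(x, x_prev, y, A, B, b, 0.1, step, 0.5, 0.5), x
```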

Generalized conditional subgradient and generalized mirror descent: duality, convergence, and symmetry

We provide new insight into a generalized conditional subgradient algorithm and a generalized mirror descent algorithm for the convex minimization problem \[\min_x \; \{f(Ax) + h(x)\}.\] As Bach showed in [SIAM J. Optim., 25 (2015), pp. 115–129], applying either of these two algorithms to this problem is equivalent to applying the other one to its …
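As a point of reference, here is a minimal sketch of one generalized mirror descent step for $\min_x \{f(Ax) + h(x)\}$ under illustrative assumptions: $h$ is taken as the indicator of the probability simplex and the Bregman distance as the KL divergence, so the step reduces to an exponentiated-gradient update. The toy loss $f(z) = \|z - b\|_1$ is an assumption, not taken from the paper.

```python
# One generalized mirror descent step:
#   x+ = argmin_y { <A^T g, y> + h(y) + KL(y, x) / eta },  g in df(Ax),
# which for h = indicator of the simplex becomes a normalized multiplicative update.
import numpy as np

def mirror_descent_step(x, A, subgrad_f, eta):
    g = subgrad_f(A @ x)               # subgradient of f at Ax
    y = x * np.exp(-eta * (A.T @ g))   # unnormalized entropic update
    return y / y.sum()                 # renormalize onto the simplex

# toy usage with f(z) = ||z - b||_1, whose subgradient is sign(z - b)
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 8))
b = rng.standard_normal(30)
x = np.full(8, 1.0 / 8)                # start at the simplex barycenter
for k in range(200):
    x = mirror_descent_step(x, A, lambda z: np.sign(z - b), 0.1 / np.sqrt(k + 1))
```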

Stochastic model-based minimization under high-order growth

Given a nonsmooth, nonconvex minimization problem, we consider algorithms that iteratively sample and minimize stochastic convex models of the objective function. Assuming that the one-sided approximation quality and the variation of the models are controlled by a Bregman divergence, we show that the scheme drives a natural stationarity measure to zero at the rate $O(k^{-1/4})$. …
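A minimal sketch of one model-based step in the Euclidean case is given below: at each iteration we sample a single term $f(x; i) = |a_i^\top x - b_i|$ of a least-absolute-deviation objective and minimize that convex model plus a proximal penalty, which yields a closed-form stochastic proximal-point update. The problem data and the step-size schedule are assumptions made for illustration.

```python
# Stochastic model-based step (Euclidean Bregman divergence):
#   x+ = argmin_y |a^T y - b| + ||y - x||^2 / (2 * eta)
# The minimizer moves along a, with the step length clipped to eta * ||a||.
import numpy as np

def stochastic_prox_point_step(x, a, b, eta):
    r = a @ x - b                       # residual of the sampled term
    s = a @ a
    return x - np.clip(r / s, -eta, eta) * a

rng = np.random.default_rng(2)
A = rng.standard_normal((500, 20))
b = A @ rng.standard_normal(20)
x = np.zeros(20)
for k in range(2000):
    i = rng.integers(len(b))            # sample one stochastic convex model
    x = stochastic_prox_point_step(x, A[i], b[i], 1.0 / np.sqrt(k + 1))
```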

“Relative-Continuity” for Non-Lipschitz Non-Smooth Convex Optimization using Stochastic (or Deterministic) Mirror Descent

The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\cdot)$ is not uniformly Lipschitz continuous — for example (i) the classical support vector machine (SVM) problem, (ii) the …
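To illustrate the flavor of mirror descent with a reference function adapted to unbounded subgradient growth, here is a minimal sketch using $h(x) = \|x\|^4/4 + \|x\|^2/2$ as the mirror map; the mirror-map inversion, the toy objective $f(x) = \|x\|^3/3$ (whose subgradients are not uniformly bounded), and the step sizes are all assumptions for illustration, not the paper's exact construction.

```python
# Mirror descent step with reference function h(x) = ||x||^4/4 + ||x||^2/2:
#   grad_h(x+) = grad_h(x) - eta * g,  so  x+ = (grad_h)^{-1}(grad_h(x) - eta * g).
import numpy as np

def grad_h(x):
    return (np.dot(x, x) + 1.0) * x

def inv_grad_h(v):
    """Solve grad_h(x) = v: x = t*v where t^3*||v||^2 + t - 1 = 0, t in (0, 1]."""
    nv2 = np.dot(v, v)
    if nv2 == 0.0:
        return np.zeros_like(v)
    roots = np.roots([nv2, 0.0, 1.0, -1.0])          # monotone cubic in t
    t = roots[np.argmin(np.abs(roots.imag))].real    # its unique real root
    return t * v

def mirror_step(x, g, eta):
    return inv_grad_h(grad_h(x) - eta * g)

# toy usage on f(x) = ||x||^3 / 3 with subgradient ||x|| * x
x = np.array([3.0, -2.0, 1.0])
for k in range(100):
    g = np.linalg.norm(x) * x
    x = mirror_step(x, g, 0.5 / np.sqrt(k + 1))
```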

A weighted Mirror Descent algorithm for nonsmooth convex optimization problem

Large scale nonsmooth convex optimization is a common problem for a range of computational areas including machine learning and computer vision. Problems in these areas contain special domain structures and characteristics. Special treatment of such problem domains, exploiting their structures, can significantly reduce the computational burden. We present a weighted Mirror Descent method to …
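A minimal sketch of a weighted mirror descent step, assuming a diagonal weighting: with mirror map $\psi(x) = \tfrac{1}{2} x^\top W x$, the update becomes the preconditioned subgradient step $x^+ = x - \eta W^{-1} g$. The weights and the toy objective below are illustrative assumptions.

```python
# Mirror descent step with Bregman distance (1/2)(y - x)^T diag(w) (y - x):
#   x+ = x - eta * g / w   (element-wise division by the weights)
import numpy as np

def weighted_md_step(x, g, w, eta):
    return x - eta * g / w

# toy usage on f(x) = ||x - c||_1 with subgradient sign(x - c)
c = np.array([1.0, -2.0, 0.5, 4.0])
w = np.array([1.0, 4.0, 2.0, 8.0])         # per-coordinate weights
x = np.zeros(4)
for k in range(300):
    x = weighted_md_step(x, np.sign(x - c), w, 1.0 / np.sqrt(k + 1))
```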

Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method, for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate the block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce the …
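The following is a minimal sketch of a stochastic block mirror descent iteration in the Euclidean case: sample a data point and a coordinate block, then update only that block with the corresponding partial stochastic subgradient. The block partition, the least-absolute-deviation objective, and the step sizes are assumptions made for illustration; averaging schemes are omitted.

```python
# One SBMD-style iteration for min_x (1/m) sum_i |a_i^T x - b_i| with a
# Euclidean prox: only the sampled block of coordinates is touched.
import numpy as np

def sbmd_step(x, blocks, A, b, rng, eta):
    i = rng.integers(A.shape[0])              # sample a stochastic subgradient
    j = rng.integers(len(blocks))             # sample a coordinate block
    idx = blocks[j]
    g_block = np.sign(A[i] @ x - b[i]) * A[i, idx]
    x = x.copy()
    x[idx] -= eta * g_block                   # Euclidean prox step on that block
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((200, 12))
b = A @ rng.standard_normal(12)
blocks = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
x = np.zeros(12)
for k in range(3000):
    x = sbmd_step(x, blocks, A, b, rng, 1.0 / np.sqrt(k + 1))
```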