An Investigation of Newton-Sketch and Subsampled Newton Methods

Sketching, a dimensionality reduction technique, has received much attention in the statistics community. In this paper, we study sketching in the context of Newton’s method for solving finite-sum optimization problems in which the number of variables and the number of data points are both large. We study two forms of sketching that perform dimensionality reduction in data space: … Read more
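
As a rough illustration of the subsampled flavor of these methods (a sketch under stated assumptions, not the paper's algorithm: the logistic-regression objective, uniform sampling, sample size, and unit step below are all illustrative choices), one Newton step can be formed from a Hessian estimated on a random subset of the data points:

import numpy as np

def subsampled_newton_step(X, y, w, sample_size, reg=1e-6):
    # One Newton step for L2-regularized logistic regression, with the
    # Hessian estimated from a uniformly sampled subset of the rows of X.
    n = X.shape[0]
    p = 1.0 / (1.0 + np.exp(-X @ w))              # per-sample probabilities
    grad = X.T @ (p - y) / n + reg * w            # full gradient
    S = np.random.choice(n, size=sample_size, replace=False)
    D = p[S] * (1.0 - p[S])                       # curvature weights on S
    H = (X[S].T * D) @ X[S] / sample_size + reg * np.eye(X.shape[1])
    return w - np.linalg.solve(H, grad)           # unit step; no line search

# Smoke test on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
y = (X @ rng.standard_normal(5) > 0).astype(float)
w = np.zeros(5)
for _ in range(10):
    w = subsampled_newton_step(X, y, w, sample_size=100)

A sketch-based variant would instead compress the data matrix with a random projection before forming the Hessian, rather than subsampling its rows.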

Polynomial Norms

In this paper, we study polynomial norms, i.e. norms that are the $d$th root of a degree-$d$ homogeneous polynomial $f$. We first show that a necessary and sufficient condition for $f^{1/d}$ to be a norm is for $f$ to be strictly convex, or equivalently, convex and positive definite. Though not all norms come from $d$th … Read more
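
For a concrete instance (a standard example, not taken from the paper): the polynomial $f(x) = x_1^4 + x_2^4$ is convex and positive definite, and its fourth root recovers the familiar $4$-norm,

$$ f(x)^{1/4} = \left( x_1^4 + x_2^4 \right)^{1/4} = \|x\|_4. $$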

The Many Faces of Degeneracy in Conic Optimization

Slater’s condition — existence of a “strictly feasible solution” — is a common assumption in conic optimization. Without strict feasibility, first-order optimality conditions may be meaningless, the dual problem may yield little information about the primal, and small changes in the data may render the problem infeasible. Hence, failure of strict feasibility can negatively impact … Read more
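
A minimal semidefinite example of this failure (illustrative, not from the paper): the feasible region

$$ \mathcal{F} = \{\, X \succeq 0 : X_{11} = 0 \,\} $$

is nonempty but contains no positive definite point, since $X \succeq 0$ together with $X_{11} = 0$ forces the entire first row and column of $X$ to vanish; strict feasibility fails even though the problem is feasible.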

Subdifferentiation and Smoothing of Nonsmooth Integral Functionals

The subdifferential calculus for the expectation of nonsmooth random integrands involves many fundamental and challenging problems in stochastic optimization. It is known that for Clarke regular integrands, the Clarke subdifferential of the expectation equals the expectation of the Clarke subdifferential of the integrand. In particular, this holds for convex integrands. However, little is known about the calculation of Clarke subgradients for the … Read more
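
Written out, the interchange result cited above states that for a Clarke regular integrand $f(\cdot,\xi)$,

$$ \partial_C\, \mathbb{E}_{\xi}\big[ f(x,\xi) \big] = \mathbb{E}_{\xi}\big[ \partial_C f(x,\xi) \big], $$

where $\partial_C$ denotes the Clarke subdifferential and the right-hand side is the set-valued expectation of the subdifferential map; convex integrands are the special case in which both sides reduce to convex subdifferentials.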

On the pointwise iteration-complexity of a dynamic regularized ADMM with over-relaxation stepsize

In this paper, we extend the improved pointwise iteration-complexity result of a dynamic regularized alternating direction method of multipliers (ADMM) to a new stepsize domain. In this complexity analysis, the stepsize parameter can even be chosen in the interval $(0,2)$ instead of the interval $(0,(1+\sqrt{5})/2)$. As usual, our analysis is established by interpreting this ADMM variant … Read more
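
For orientation, writing the two-block constraint as $Ax + By = b$ with penalty $\beta > 0$, the over-relaxation stepsize $\theta$ enters the multiplier update of a generic ADMM as (schematic template, not the specific dynamic regularized variant analyzed in the paper)

$$ \lambda^{k+1} = \lambda^{k} - \theta\beta\,\big( A x^{k+1} + B y^{k+1} - b \big), \qquad \theta \in (0,2), $$

whereas the classical Fortin–Glowinski analysis restricts $\theta$ to $(0,(1+\sqrt{5})/2)$.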

The symmetric ADMM with positive-indefinite proximal regularization and its application

Because it updates the Lagrange multiplier twice at each iteration, the symmetric alternating direction method of multipliers (S-ADMM) often performs better than other ADMM-type methods. In practice, proximal terms with positive definite proximal matrices are often added to its subproblems, and it is commonly known that a large proximal parameter in the proximal term often … Read more
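
Schematically, for $\min\, f(x) + g(y)$ subject to $Ax + By = b$ with augmented Lagrangian $\mathcal{L}_\beta$, the symmetric scheme updates the multiplier both between and after the two subproblems; with proximal matrices $T_1, T_2$ added as above and multiplier stepsizes $r, s$ (the stepsize names are ours), one iteration reads

$$
\begin{aligned}
x^{k+1} &= \arg\min_x\; \mathcal{L}_\beta(x, y^k, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_{T_1}^2, \\
\lambda^{k+1/2} &= \lambda^k - r\beta\,(A x^{k+1} + B y^k - b), \\
y^{k+1} &= \arg\min_y\; \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+1/2}) + \tfrac{1}{2}\|y - y^k\|_{T_2}^2, \\
\lambda^{k+1} &= \lambda^{k+1/2} - s\beta\,(A x^{k+1} + B y^{k+1} - b),
\end{aligned}
$$

and the question studied here is how far the positive definiteness requirement on the proximal matrices can be relaxed.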

Local Linear Convergence Analysis of Primal–Dual Splitting Methods

In this paper, we study the local linear convergence properties of a versatile class of Primal–Dual splitting methods for solving composite non-smooth convex optimization problems. Under the assumption that the non-smooth components of the problem are partly smooth relative to smooth manifolds, we present a unified local convergence analysis framework for these Primal–Dual splitting methods. … Read more
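
One representative member of this class, stated here for orientation, is the Chambolle–Pock iteration for $\min_x f(x) + g(Lx)$:

$$
\begin{aligned}
y^{k+1} &= \mathrm{prox}_{\sigma g^*}\big( y^k + \sigma L \bar{x}^k \big), \\
x^{k+1} &= \mathrm{prox}_{\tau f}\big( x^k - \tau L^* y^{k+1} \big), \\
\bar{x}^{k+1} &= x^{k+1} + \theta\,( x^{k+1} - x^k ).
\end{aligned}
$$

Under partial smoothness, the iterates of such methods identify the active manifolds of $f$ and $g$ in finitely many steps, after which the convergence-rate analysis becomes local.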

ADMM for monotone operators: convergence analysis and rates

We propose in this paper a unifying scheme for several algorithms from the literature dedicated to solving monotone inclusion problems involving compositions with linear continuous operators in infinite-dimensional Hilbert spaces. We show that a number of primal-dual algorithms for monotone inclusions, and also the classical ADMM numerical scheme for convex optimization problems, … Read more
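
A typical template for the problems covered (a representative form, not a quote from the paper) is the inclusion

$$ \text{find } x \in \mathcal{H} \ \text{ such that } \ 0 \in Ax + L^{*}B(Lx), $$

with $A$ and $B$ maximally monotone and $L$ linear and continuous; both the primal-dual algorithms and ADMM can then be read as fixed-point schemes for such inclusions.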

A symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming

This paper introduces a symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming with linear equality constraints, which inherits the advantages of the classical alternating direction method of multipliers (ADMM) and extends the feasible set of the relaxation factor $\alpha$ of the generalized ADMM to the infinite interval $[1,+\infty)$. … Read more
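
A toy numerical sketch of a symmetric ADMM of this flavor (illustrative only: the quadratic test problem, penalty $\beta$, and the two multiplier stepsizes r, s below are assumptions; the paper's relaxation factor $\alpha$ enters its generalized scheme differently):

import numpy as np

# Toy two-block problem: min 0.5||x-a||^2 + 0.5||y-b||^2  s.t.  x + y = c.
a, b, c = np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([2.0, 2.0])
beta, r, s = 1.0, 0.9, 0.9                 # penalty and multiplier stepsizes
x, y, lam = np.zeros(2), np.zeros(2), np.zeros(2)

for _ in range(200):
    # x-subproblem of the augmented Lagrangian (closed form here)
    x = (a + lam - beta * (y - c)) / (1.0 + beta)
    lam = lam - r * beta * (x + y - c)     # first multiplier update
    # y-subproblem, using the intermediate multiplier
    y = (b + lam - beta * (x - c)) / (1.0 + beta)
    lam = lam - s * beta * (x + y - c)     # second multiplier update

print(x, y, x + y - c)                     # residual should be near zero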

A Note on the Forward-Douglas–Rachford Splitting for Monotone Inclusion and Convex Optimization

We shed light on the structure of the “three-operator” version of the forward-Douglas–Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvents of $A$ and $C$, separately. We show that it … Read more
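
For reference, with stepsize $\gamma > 0$ and in the notation above ($B$ cocoercive, resolvents of $A$ and $C$ available), one iteration of such a three-operator scheme, written here in Davis–Yin form, reads

$$
\begin{aligned}
u^{k} &= J_{\gamma A}(z^{k}), \\
v^{k} &= J_{\gamma C}\big( 2u^{k} - z^{k} - \gamma B u^{k} \big), \\
z^{k+1} &= z^{k} + \lambda_k\,( v^{k} - u^{k} ),
\end{aligned}
$$

where $J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}$ is the resolvent; a fixed point $z^{\star}$ yields $J_{\gamma A}(z^{\star}) \in \mathrm{zer}(A + B + C)$.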