A second order dynamical approach with variable damping to nonconvex smooth minimization

We investigate a second order dynamical system with variable damping in connection with the minimization of a nonconvex differentiable function. The dynamical system is formulated in the spirit of the differential equation which models Nesterov’s accelerated convex gradient method. We show that the generated trajectory converges to a critical point, if a regularization of the … Read more
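For orientation, the differential equation most often associated with Nesterov's accelerated gradient method is $\ddot{x}(t) + \tfrac{\alpha}{t}\,\dot{x}(t) + \nabla g(x(t)) = 0$, with a damping coefficient that vanishes as $t$ grows. The sketch below numerically integrates an ODE of this form for an illustrative one-dimensional nonconvex function; the damping law, the objective $g$, and the step size are placeholder choices for illustration, not the specific system analyzed in the paper.

    import numpy as np

    def grad_g(x):
        # illustrative smooth nonconvex objective g(x) = x^4/4 - x^2/2 (placeholder choice)
        return x**3 - x

    def integrate_variable_damping_ode(x0, t0=1.0, T=50.0, h=1e-3, alpha=3.0):
        # explicit Euler integration of  x''(t) + (alpha/t) x'(t) + grad_g(x(t)) = 0,
        # the ODE commonly used to model Nesterov's accelerated gradient method
        x, v, t = float(x0), 0.0, t0
        while t < T:
            a = -(alpha / t) * v - grad_g(x)   # acceleration prescribed by the ODE
            v += h * a
            x += h * v
            t += h
        return x

    print(integrate_variable_damping_ode(2.0))  # the trajectory is attracted to the critical point x = 1 of g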

The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates

We propose two numerical algorithms for minimizing the sum of a smooth function and the composition of a nonsmooth function with a linear operator in the fully nonconvex setting. The iterative schemes are formulated in the spirit of the proximal and, respectively, proximal linearized alternating direction method of multipliers. The proximal terms are introduced through … Read more
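As a rough illustration of the kind of iteration involved (not the paper's exact scheme), the sketch below applies a linearized proximal ADMM-style update to $\min_x f(x) + \lambda\|Ax\|_1$ with $z = Ax$: the smooth part and the quadratic coupling term are linearized in the $x$-step, while the nonsmooth part is handled through its proximal map. The $\ell_1$ choice of nonsmooth term and the step-size rule are assumptions made for the example.

    import numpy as np

    def prox_l1(v, t):
        # proximal map of t*||.||_1 (soft-thresholding)
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def linearized_proximal_admm(grad_f, A, lam, x0, beta=1.0, n_iter=500):
        # sketch for  min_x f(x) + lam*||A x||_1,  reformulated with the splitting z = A x
        m, n = A.shape
        x, z, y = x0.copy(), A @ x0, np.zeros(m)
        tau = 1.0 / (beta * np.linalg.norm(A, 2) ** 2 + 1.0)   # crude, untuned step size
        for _ in range(n_iter):
            # x-step: gradient step on the augmented Lagrangian (f and coupling linearized)
            x = x - tau * (grad_f(x) + beta * A.T @ (A @ x - z + y / beta))
            # z-step: exact proximal step on lam*||.||_1
            z = prox_l1(A @ x + y / beta, lam / beta)
            # multiplier update
            y = y + beta * (A @ x - z)
        return x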

Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

We generalize the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions via a new measure of steepness. For the deterministic projected subgradient method, we derive a global $O(1/\sqrt{T})$ convergence rate for any function with at most exponential growth. Our approach implies generalizations of the standard convergence rates for gradient descent on … Read more
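For context, here is a minimal sketch of the deterministic projected subgradient method with a $1/\sqrt{t}$ step size, the setting in which the classical $O(1/\sqrt{T})$ rate is usually stated (under Lipschitz continuity, which the paper relaxes). The toy problem at the end is an assumption for illustration only.

    import numpy as np

    def projected_subgradient(f, subgrad, project, x0, T=1000, R=1.0):
        # deterministic projected subgradient method; returns the best iterate found
        x, x_best, f_best = x0.copy(), x0.copy(), f(x0)
        for t in range(1, T + 1):
            x = project(x - (R / np.sqrt(t)) * subgrad(x))   # subgradient step + projection
            if f(x) < f_best:                                # keep the best point seen so far
                x_best, f_best = x.copy(), f(x)
        return x_best

    # toy example: minimize |x - 2| over the interval [0, 1]
    x_star = projected_subgradient(
        f=lambda x: abs(x[0] - 2.0),
        subgrad=lambda x: np.array([np.sign(x[0] - 2.0)]),
        project=lambda x: np.clip(x, 0.0, 1.0),
        x0=np.zeros(1),
    )
    print(x_star)   # -> approximately [1.]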

Convergence rates of accelerated proximal gradient algorithms under independent noise

We consider an accelerated proximal gradient algorithm for composite optimization with “independent errors” (errors largely unrelated to historical information) for solving linear inverse problems. We present a new inexact version of the FISTA algorithm that accounts for deterministic and stochastic noise. We prove convergence rates for the algorithm and connect it with the current existing … Read more
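A minimal sketch of a FISTA iteration with an additive perturbation of the gradient, meant only to illustrate how such errors can enter the scheme; the callable `noise(k)`, the step size $1/L$, and the fixed iteration count are placeholder choices rather than the algorithm proposed in the paper.

    import numpy as np

    def inexact_fista(grad_f, prox_g, x0, L, n_iter=200, noise=lambda k: 0.0):
        # FISTA for min f(x) + g(x), with an error term added to the gradient step
        x, y, t = x0.copy(), x0.copy(), 1.0
        for k in range(n_iter):
            e = noise(k)                                        # deterministic or stochastic error
            x_next = prox_g(y - (grad_f(y) + e) / L, 1.0 / L)   # perturbed proximal-gradient step
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)    # Nesterov extrapolation
            x, t = x_next, t_next
        return x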

Generalized ADMM with Optimal Indefinite Proximal Term for Linearly Constrained Convex Optimization

We consider the generalized alternating direction method of multipliers (ADMM) for linearly constrained convex optimization. Experience with many problems arising from practical applications shows that one of the subproblems in the generalized ADMM is often hard to solve, so a special proximal term is added. In the literature, the proximal term can be indefinite, which plays … Read more
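For reference (one standard construction, not necessarily the one used in the paper), the proximal term typically enters the $x$-subproblem as $\tfrac{1}{2}\|x-x^k\|_P^2$; a popular choice that linearizes the subproblem is
\[
x^{k+1} = \arg\min_x \Big\{ \mathcal{L}_\beta(x, y^k, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_P^2 \Big\},
\qquad P = \tau I - \beta A^{\top} A ,
\]
and $P$ becomes indefinite as soon as $\tau < \beta\|A^{\top} A\|$; how small $\tau$ may be taken while convergence is retained is, roughly, what "optimal" typically refers to in this line of work.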

Optimal Linearized Alternating Direction Method of Multipliers for Convex Programming

The alternating direction method of multipliers (ADMM) is widely used in a variety of areas, and its variants tailored to different application scenarios have been studied extensively in the literature. Among them, the linearized ADMM has received particularly wide attention in many areas because of its efficiency and ease of implementation. To theoretically guarantee … Read more
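The practical appeal mentioned here comes from the fact that, with the proximal matrix $P = \tau I - \beta A^{\top} A$, the $x$-subproblem collapses to a single proximal evaluation of $\theta_1$. As a generic sketch for $\min\{\theta_1(x)+\theta_2(y) : Ax + By = b\}$ with multiplier $\lambda$ and penalty $\beta$ (not a statement of the paper's scheme), the linearized $x$-step reads
\[
x^{k+1} = \arg\min_x \Big\{ \theta_1(x) + \tfrac{\tau}{2}\Big\|x - \Big(x^k - \tfrac{\beta}{\tau} A^{\top}\big(Ax^k + By^k - b - \lambda^k/\beta\big)\Big)\Big\|^2 \Big\},
\]
so only the proximal map of $\theta_1$ is needed, rather than a solve involving $A^{\top}A$.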

Chambolle-Pock and Tseng’s methods: relationship and extension to the bilevel optimization

In the first part of the paper we focus on two problems: (a) regularized least squares and (b) nonsmooth minimization over an affine subspace. For these problems we establish the connection between the primal-dual method of Chambolle-Pock and Tseng’s proximal gradient method. For problem (a) it allows us to derive a nonergodic $O(1/k^2)$ convergence rate … Read more
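A small self-contained example of the Chambolle-Pock iteration on a regularized least-squares instance, $\min_x \tfrac12\|Ax-b\|^2+\lambda\|x\|_1$, written as $\min_x f(Ax)+g(x)$; the random data, step sizes, and the $\ell_1$ regularizer are illustrative assumptions and not taken from the paper.

    import numpy as np

    def soft_threshold(v, t):
        # proximal map of t*||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def chambolle_pock_lasso(A, b, lam, n_iter=500):
        # primal-dual iteration for min_x f(Ax) + g(x) with
        # f = 0.5*||. - b||^2 and g = lam*||.||_1
        m, n = A.shape
        Lnorm = np.linalg.norm(A, 2)
        sigma = tau = 0.99 / Lnorm                  # step sizes with sigma*tau*||A||^2 < 1
        x = np.zeros(n); x_bar = x.copy(); y = np.zeros(m)
        for _ in range(n_iter):
            y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)        # prox of sigma*f*
            x_new = soft_threshold(x - tau * (A.T @ y), tau * lam)   # prox of tau*g
            x_bar = 2.0 * x_new - x                                  # extrapolation (theta = 1)
            x = x_new
        return x

    # quick usage on random data
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = A @ (rng.standard_normal(50) * (rng.random(50) < 0.1))       # sparse ground truth
    print(chambolle_pock_lasso(A, b, lam=0.1)[:5])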

Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization. Part II: Complexity and Numerical Results

We present complexity and numerical results for a new asynchronous parallel algorithmic method for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The proposed method hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of … Read more

Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets

Moment-sum-of-squares hierarchies of semidefinite programs can be used to approximate the volume of a given compact basic semialgebraic set $K$. The idea is to approximate from above the indicator function of $K$ by a sequence of polynomials of increasing degree $d$, so that the integrals of these polynomials generate a convergent sequence of upper bounds … Read more
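In one standard formulation (assuming $K \subset B$ for a simple set $B$ such as a box or a ball whose moments are known), the degree-$d$ problem reads
\[
\mathrm{vol}(K) \;\le\; \min_{p \in \mathbb{R}[x]_{d}} \int_B p(x)\, dx
\quad \text{s.t.} \quad p \ge 1 \ \text{on } K, \qquad p \ge 0 \ \text{on } B,
\]
where the two nonnegativity constraints are certified by sums of squares of degree at most $d$, turning the problem into a semidefinite program whose optimal value decreases toward $\mathrm{vol}(K)$ as $d \to \infty$.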

Positive-Indefinite Proximal Augmented Lagrangian Method and its Application to Full Jacobian Splitting for Multi-block Separable Convex Minimization Problems

The augmented Lagrangian method (ALM) is fundamental for solving convex programming problems with linear constraints. The proximal version of ALM, which regularizes ALM’s subproblem over the primal variable at each iteration by an additional positive-definite quadratic proximal term, has been well studied in the literature. In this paper, we show that it is not necessary … Read more
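For reference, one iteration of the proximal ALM for $\min\{\theta(x) : Ax = b\}$ takes the form (a generic sketch, not the paper's specific scheme)
\[
x^{k+1} = \arg\min_x \Big\{ \mathcal{L}_\beta(x, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_D^2 \Big\},
\qquad
\lambda^{k+1} = \lambda^k - \beta\,(Ax^{k+1} - b),
\]
where the classical analyses require the proximal matrix $D$ to be positive definite; the relaxation discussed in the paper concerns weakening that positive-definiteness requirement.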