On Stable Piecewise Linearization and Generalized Algorithmic Differentiation

It is shown how functions defined by evaluation programs involving the absolute value function (besides smooth elementals) can be approximated locally by piecewise-linear models in the style of algorithmic, or automatic, differentiation (AD). The model can be generated by a minor modification of standard AD tools, and it is Lipschitz continuous with respect …
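As a concrete illustration, here is a minimal sketch (in Python, with invented names; not the paper's actual tool) of how such a model can be propagated forward: smooth elementals are linearized as in tangent-mode AD, while each absolute value is kept exact, so that the increment of w = |u| is |u + du| - |u|.

    import math

    class PLTangent:
        """Pairs a reference value v with a piecewise-linear increment dv."""
        def __init__(self, v, dv):
            self.v, self.dv = v, dv

        def __add__(self, other):
            return PLTangent(self.v + other.v, self.dv + other.dv)

        def __mul__(self, other):
            # Smooth elemental: ordinary tangent-linear product rule.
            return PLTangent(self.v * other.v,
                             self.v * other.dv + self.dv * other.v)

    def pl_sin(t):
        # Smooth elemental: linearize as usual.
        return PLTangent(math.sin(t.v), math.cos(t.v) * t.dv)

    def pl_abs(t):
        # The abs elemental is kept exact: the increment of w = |u|
        # is |u + du| - |u|, which is piecewise linear in du.
        return PLTangent(abs(t.v), abs(t.v + t.dv) - abs(t.v))

    # Example program y = |x * sin(x)| evaluated at x0 with increment dx.
    def model_increment(x0, dx):
        x = PLTangent(x0, dx)
        return pl_abs(x * pl_sin(x)).dv

    print(model_increment(1.2, 0.3))

The returned increment is piecewise linear in dx, in line with the local piecewise-linear models described above.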

Second-order variational analysis and characterizations of tilt-stable optimal solutions in finite and infinite dimensions

The paper is devoted to developing second-order tools of variational analysis and their applications to characterizing tilt-stable local minimizers of constrained optimization problems in finite-dimensional and infinite-dimensional spaces. The importance of tilt stability has been well recognized in both the theory and the numerical practice of optimization. Based on second-order generalized differentiation, we obtain qualitative and quantitative …
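For reference, tilt stability in the sense of Poliquin and Rockafellar can be stated as follows: a point \bar{x} with f(\bar{x}) finite is a tilt-stable local minimizer of f if there is \gamma > 0 such that the tilted solution map

    \[
      M_\gamma(v) := \operatorname*{argmin}_{\|x - \bar{x}\| \le \gamma} \{ f(x) - \langle v, x \rangle \}
    \]

is single-valued and Lipschitz continuous on a neighborhood of v = 0 with M_\gamma(0) = \bar{x}. The characterizations referred to above are expressed through second-order generalized derivatives of f.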

On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers

The formulation min f(x) + g(y) subject to Ax + By = b, where f and g are extended-valued convex functions, arises, either naturally or after variable splitting, in many application areas such as signal processing, imaging and image processing, statistics, and machine learning. In many common problems, one of the two objective functions is strictly convex and has Lipschitz continuous …
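For orientation, the classical ADMM iteration for this formulation alternates an x-minimization, a y-minimization, and a dual update. The sketch below is the standard (not the generalized) scheme, applied to min ½||Dx − c||² + λ||y||₁ subject to x − y = 0, i.e. A = I, B = −I, b = 0 after variable splitting; D, c, and λ are assumed problem data.

    import numpy as np

    def admm_lasso(D, c, lam, rho=1.0, iters=200):
        n = D.shape[1]
        x = y = u = np.zeros(n)
        # Pre-factor the x-subproblem's normal equations (D^T D + rho I).
        Q = np.linalg.inv(D.T @ D + rho * np.eye(n))
        for _ in range(iters):
            # x-step: smooth quadratic subproblem, solved in closed form.
            x = Q @ (D.T @ c + rho * (y - u))
            # y-step: prox of the l1 term, i.e. soft-thresholding at lam/rho.
            y = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)
            # Scaled dual update on the constraint residual x - y.
            u = u + x - y
        return y

    rng = np.random.default_rng(0)
    D, c = rng.standard_normal((40, 20)), rng.standard_normal(40)
    print(admm_lasso(D, c, lam=0.1)[:5])

Both subproblems here are closed-form, which is typical of the splittings this formulation is designed to exploit.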

An Efficient Augmented Lagrangian Method with Applications to Total Variation Minimization

Based on the classic augmented Lagrangian multiplier method, we propose, analyze, and test an algorithm for solving a class of equality-constrained non-smooth optimization problems (chiefly but not necessarily convex programs) with a particular structure. The algorithm effectively combines an alternating direction technique with a nonmonotone line search to minimize the augmented Lagrangian function at each …
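In the notation of a generic equality-constrained problem min f(x) subject to Ax = b (the total variation application can be brought to this form by variable splitting), the classic augmented Lagrangian and first-order multiplier update read

    \[
      \mathcal{L}_\mu(x, \lambda) = f(x) + \lambda^{\top}(Ax - b) + \tfrac{\mu}{2}\|Ax - b\|_2^2,
      \qquad
      \lambda^{k+1} = \lambda^k + \mu (A x^{k+1} - b),
    \]

where x^{k+1} approximately minimizes \mathcal{L}_\mu(\cdot, \lambda^k), here via the alternating direction technique combined with a nonmonotone line search.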

A Block Coordinate Descent Method for Regularized Multi-Convex Optimization with Applications to Nonnegative Tensor Factorization and Completion

This paper considers regularized block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. We review several interesting examples and propose a generalized block coordinate descent method. (Using proximal updates, we further allow non-convexity over some blocks.) Under certain conditions, we show that …
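A minimal sketch of the block structure, using nonnegative matrix (rather than tensor) factorization as the multi-convex example: ½||M − XY||²_F is non-convex jointly but convex in each of X and Y separately, so each block update below is a projected gradient step, one simple instance of the proximal updates mentioned above; names and step sizes are illustrative.

    import numpy as np

    def bcd_nmf(M, r, iters=300):
        m, n = M.shape
        rng = np.random.default_rng(0)
        X, Y = rng.random((m, r)), rng.random((r, n))
        for _ in range(iters):
            # Block 1: gradient step in X (convex in X for fixed Y) with
            # step 1/L, L = ||Y Y^T||_2, then project onto X >= 0.
            G = (X @ Y - M) @ Y.T
            X = np.maximum(X - G / (np.linalg.norm(Y @ Y.T, 2) + 1e-12), 0)
            # Block 2: the same update for Y with X fixed.
            G = X.T @ (X @ Y - M)
            Y = np.maximum(Y - G / (np.linalg.norm(X.T @ X, 2) + 1e-12), 0)
        return X, Y

    M = np.abs(np.random.default_rng(1).standard_normal((30, 20)))
    X, Y = bcd_nmf(M, r=5)
    print(np.linalg.norm(M - X @ Y))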

Characterizations of Full Stability in Constrained Optimization

This paper is mainly devoted to the study of the so-called full Lipschitzian stability of local solutions to finite-dimensional parameterized problems of constrained optimization, a property well recognized as very important from the viewpoints of both optimization theory and its applications. Based on second-order generalized differential tools of variational analysis, we obtain …
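Full stability, in the sense introduced by Levy, Poliquin, and Rockafellar, extends tilt stability by perturbing both the tilt v and a basic parameter w: roughly, \bar{x} is a fully stable local minimizer of the problem of minimizing f(x, w) - \langle v, x \rangle if for some \gamma > 0 the localized solution map

    \[
      M_\gamma(w, v) := \operatorname*{argmin}_{\|x - \bar{x}\| \le \gamma} \{ f(x, w) - \langle v, x \rangle \}
    \]

is single-valued and Lipschitz continuous around the reference pair (\bar{w}, \bar{v}) with M_\gamma(\bar{w}, \bar{v}) = \bar{x}.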

Some criteria for error bounds in set optimization

We obtain sufficient and/or necessary conditions for global/local error bounds for the distances to several sets that appear in set optimization, studied under both the set approach and the vector approach (sublevel sets, constraint sets, sets of “all” Pareto efficient / Henig proper efficient / super efficient solutions, sets of solutions “corresponding to one” Pareto efficient / Henig proper …
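The scalar prototype that such criteria generalize may help fix ideas: a set S = \{x : g(x) \le 0\} admits a global error bound with modulus c > 0 when

    \[
      d(x, S) \le c\,[g(x)]_+ \quad \text{for all } x,
    \]

and a local error bound when the inequality is only required near a reference point \bar{x} \in S. Here the roles of g and S are played by the set-optimization data listed above (sublevel sets, constraint sets, and the various solution sets).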

An adaptive accelerated first-order method for convex optimization

This paper presents a new accelerated variant of Nesterov’s method for solving composite convex optimization problems, in which certain acceleration parameters are chosen adaptively (and aggressively) so as to substantially improve practical performance compared to existing accelerated variants, while preserving the optimal iteration complexity shared by these methods. Computational results are …
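As background, the standard non-adaptive accelerated (FISTA-style) proximal gradient method that such variants build on is sketched below for f(x) = ½||Ax − b||² plus g = λ||·||₁; the paper’s adaptive parameter choices are not reproduced here, and the schedule for t_k is the usual fixed one.

    import numpy as np

    def fista(A, b, lam, iters=200):
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
        x = z = np.zeros(A.shape[1])
        t = 1.0
        for _ in range(iters):
            grad = A.T @ (A @ z - b)
            w = z - grad / L                   # gradient step at extrapolated point
            # Prox of g = lam*||.||_1 with step 1/L: soft-thresholding.
            x_new = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0)
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            z = x_new + ((t - 1) / t_new) * (x_new - x)   # extrapolation
            x, t = x_new, t_new
        return x

    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((50, 30)), rng.standard_normal(50)
    print(fista(A, b, lam=0.1)[:5])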

Complexity Analysis of Interior Point Algorithms for Non-Lipschitz and Nonconvex Minimization

We propose a first-order interior point algorithm for a class of non-Lipschitz and nonconvex minimization problems with box constraints, which arise from applications in variable selection and regularized optimization. The objective functions of these problems are typically continuously differentiable only at interior points of the feasible set. Our algorithm is easy to implement and the …
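A representative instance from variable selection (an illustrative assumption, not necessarily the paper’s exact problem class) is

    \[
      \min_{0 \le x \le u} \; \|Ax - b\|_2^2 + \lambda \sum_{i=1}^{n} x_i^{p}, \qquad 0 < p < 1,
    \]

whose objective is continuously differentiable at every interior point x > 0 but non-Lipschitz wherever a coordinate x_i vanishes, since the derivative p\,x_i^{p-1} blows up as x_i \to 0^+. This is precisely why iterates are kept in the interior of the box.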

Optimality, identifiability, and sensitivity

Around a solution of an optimization problem, an “identifiable” subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are “minimal”. This new notion lays a broad and intuitive variational-analytic …
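Roughly, and in the sense used in this line of work, a set M containing a solution \bar{x} is identifiable at \bar{x} for a subgradient \bar{v} \in \partial f(\bar{x}) when

    \[
      x_k \to \bar{x}, \quad v_k \to \bar{v}, \quad v_k \in \partial f(x_k)
      \;\Longrightarrow\; x_k \in M \ \text{for all large } k,
    \]

so that nearby approximately critical points eventually lie in M; a “minimal” identifiable set is the smallest set with this property.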