Some criteria for error bounds in set optimization

We obtain sufficient and/or necessary conditions for global/local error bounds for the distances to some sets appearing in set optimization, studied with both the set approach and the vector approach (sublevel sets, constraint sets, sets of {\it all} Pareto efficient/Henig proper efficient/super efficient solutions, sets of solutions {\it corresponding to one} Pareto efficient/Henig proper … Read more
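For orientation, a hedged sketch of the standard notion behind these results, assuming the simplest case of a sublevel set $S=\{x\in X: f(x)\le 0\}$ (the sets treated in the paper are more general): a global error bound holds when

\[
\exists\,\tau>0:\quad d(x,S)\le \tau\,[f(x)]_+ \quad \text{for all } x\in X,
\]

where $[f(x)]_+=\max\{f(x),0\}$ and $d(x,S)$ is the distance from $x$ to $S$; a local error bound at $\bar{x}\in S$ requires the same inequality only for $x$ in some neighborhood of $\bar{x}$.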

Complexity Analysis of Interior Point Algorithms for Non-Lipschitz and Nonconvex Minimization

We propose a first-order interior point algorithm for a class of non-Lipschitz and nonconvex minimization problems with box constraints, which arise from applications in variable selection and regularized optimization. The objective functions of these problems are typically continuously differentiable only at interior points of the feasible set. Our algorithm is easy to implement and the … Read more
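As a rough illustration of the problem class (a sketch, not the authors' algorithm), consider $\min \|Ax-b\|^2+\lambda\sum_i x_i^p$ over $0\le x\le u$ with $0<p<1$: the term $x_i^p$ is differentiable only for $x_i>0$, so a first-order interior point method must keep its iterates strictly inside the box. All names and parameters below are illustrative.

import numpy as np

def interior_first_order(A, b, lam=0.1, p=0.5, u=1.0, iters=500, frac=0.99):
    # Damped gradient steps for min ||Ax-b||^2 + lam*sum(x_i^p), 0 < x < u.
    # The x^p term (0 < p < 1) is differentiable only on the interior, so each
    # step is shortened (fraction-to-the-boundary) to stay strictly inside.
    n = A.shape[1]
    x = np.full(n, 0.5 * u)                # strictly interior starting point
    L = 2 * np.linalg.norm(A, 2) ** 2      # Lipschitz bound for the smooth part
    for _ in range(iters):
        grad = 2 * A.T @ (A @ x - b) + lam * p * x ** (p - 1)
        d = -grad / L                      # tentative step
        t, neg, pos = 1.0, d < 0, d > 0
        if neg.any():
            t = min(t, frac * np.min(-x[neg] / d[neg]))       # distance to x_i = 0
        if pos.any():
            t = min(t, frac * np.min((u - x[pos]) / d[pos]))  # distance to x_i = u
        x = x + t * d
    return x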

A QCQP Approach to Triangulation

Triangulation of a three-dimensional point from $n\ge 2$ two-dimensional images can be formulated as a quadratically constrained quadratic program. We propose an algorithm to extract candidate solutions to this problem from its semidefinite programming relaxations. We then describe a sufficient condition and a polynomial-time test for certifying when such a solution is optimal. This … Read more
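The generic pattern can be sketched as follows (a hedged toy, not the paper's specific triangulation relaxation): form the Shor SDP relaxation of a homogenized QCQP, then extract a rank-one candidate from the leading eigenvector; when the returned matrix is numerically rank one, the candidate is optimal for the original QCQP.

import cvxpy as cp
import numpy as np

# Homogenized QCQP: min [x;1]^T M0 [x;1]  s.t.  [x;1]^T M1 [x;1] = 0,
# with linear/constant terms folded into the last row/column of each M.
n = 3
rng = np.random.default_rng(0)
M0 = rng.standard_normal((n + 1, n + 1)); M0 = M0 @ M0.T  # PSD objective
M1 = np.diag([1.0] * n + [-1.0])                          # encodes ||x||^2 = 1

X = cp.Variable((n + 1, n + 1), symmetric=True)
prob = cp.Problem(cp.Minimize(cp.trace(M0 @ X)),
                  [X >> 0, X[n, n] == 1, cp.trace(M1 @ X) == 0])
prob.solve()

# Candidate extraction: leading eigenvector of X, rescaled so the
# homogenizing coordinate equals one.
w, V = np.linalg.eigh(X.value)
v = V[:, -1]
x_cand = v[:n] / v[n]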

A first-order block-decomposition method for solving two-easy-block structured semidefinite programs

In this paper, we consider a first-order block-decomposition method for minimizing the sum of a convex differentiable function with Lipschitz continuous gradient, and two other proper closed convex (possibly, nonsmooth) functions with easily computable resolvents. The method presented contains two ingredients that are important from a computational point of view, namely an adaptive choice of stepsize for … Read more
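The problem template, $\min\, f(x)+g_1(x)+g_2(x)$ with $f$ smooth and both $g_i$ prox-friendly, can be illustrated with the related Davis-Yin three-operator splitting (a sketch of the same problem class, not the block-decomposition method of the paper); the prox and gradient callables are assumed inputs.

import numpy as np

def davis_yin(grad_f, prox_g1, prox_g2, z0, gamma, iters=1000):
    # Three-operator splitting for min f + g1 + g2 (f smooth, gamma < 2/L).
    # Each iteration uses one gradient of f and one resolvent of each g_i.
    z = z0.copy()
    for _ in range(iters):
        xb = prox_g2(z, gamma)
        xa = prox_g1(2 * xb - z - gamma * grad_f(xb), gamma)
        z = z + xa - xb
    return xb

# Toy instance: f(x) = 0.5*||x - c||^2, g1 = indicator of x >= 0, g2 = ||.||_1
c = np.array([1.0, -2.0, 0.3])
x = davis_yin(grad_f=lambda x: x - c,
              prox_g1=lambda v, g: np.maximum(v, 0.0),
              prox_g2=lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g, 0.0),
              z0=np.zeros(3), gamma=1.0)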

Optimality, identifiability, and sensitivity

Around a solution of an optimization problem, an “identifiable” subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are “minimal”. This new notion lays a broad and intuitive variational-analytic … Read more
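A hedged paraphrase of the underlying notion: given $f$ and a subgradient $\bar{v}\in\partial f(\bar{x})$, a set $M\ni\bar{x}$ is identifiable at $\bar{x}$ for $\bar{v}$ when

\[
x_k\to\bar{x},\quad f(x_k)\to f(\bar{x}),\quad \partial f(x_k)\ni v_k\to\bar{v}
\quad\Longrightarrow\quad x_k\in M \ \text{for all large } k.
\]

For example, for $f(x)=|x|$ on $\mathbb{R}$ the set $M=\{0\}$ is identifiable at $\bar{x}=0$ for any $\bar{v}\in(-1,1)$, since subgradients close to $\bar{v}$ occur only at $x=0$.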

Stochastic optimization and sparse statistical recovery: An optimal algorithm for high dimensions

We develop and analyze stochastic optimization algorithms for problems in which the expected loss is strongly convex, and the optimum is (approximately) sparse. Previous approaches are able to exploit only one of these two structures, yielding an $\mathcal{O}(d/T)$ convergence rate for strongly convex objectives in $d$ dimensions, and an $\mathcal{O}(\sqrt{(s \log d)/T})$ convergence rate when … Read more
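Not the algorithm of the paper, but a minimal sketch of the setting it targets: stochastic proximal gradient on a strongly convex expected loss with an $\ell_1$ penalty promoting the (approximately) sparse optimum. All data and step-size choices below are illustrative.

import numpy as np

def prox_sgd_l1(draw, d, lam=0.05, mu=1.0, T=5000, seed=0):
    # Stochastic proximal gradient for min E[loss] + lam*||x||_1,
    # with step sizes 1/(mu*t) suited to a mu-strongly-convex loss.
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    for t in range(1, T + 1):
        a, y = draw(rng)                       # one (feature, label) sample
        g = (a @ x - y) * a + mu * x           # grad of least squares + ridge
        eta = 1.0 / (mu * t)
        v = x - eta * g
        x = np.sign(v) * np.maximum(np.abs(v) - eta * lam, 0.0)  # soft-threshold
    return x

# Toy stream: sparse ground truth, noisy linear measurements
d = 100
x_star = np.zeros(d); x_star[:5] = 1.0
def draw(rng):
    a = rng.standard_normal(d)
    return a, a @ x_star + 0.1 * rng.standard_normal()
x_hat = prox_sgd_l1(draw, d)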

An acceleration procedure for optimal first-order methods

We introduce in this paper an optimal first-order method that allows an easy and cheap evaluation of the local Lipschitz constant of the objective’s gradient. Ideally, this constant should be chosen at every iteration as small as possible, while still providing a valid upper bound on the objective function value. In the previously … Read more
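The standard way this trade-off is handled (a hedged sketch in the spirit of FISTA-type backtracking, not necessarily the procedure of this paper): try a small estimate $L$ and increase it until the quadratic model at the trial point genuinely upper-bounds the objective.

import numpy as np

def accel_gradient_backtracking(f, grad, x0, L0=1.0, inc=2.0, iters=200):
    # Accelerated gradient method with a per-iteration search for a local
    # Lipschitz estimate L: the smallest tried L whose quadratic model
    # upper-bounds f at the trial point is accepted.
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(iters):
        g = grad(y)
        while True:
            x_new = y - g / L
            # check f(x_new) <= f(y) + <g, x_new - y> + (L/2)*||x_new - y||^2,
            # which for this step simplifies to f(x_new) <= f(y) - ||g||^2/(2L)
            if f(x_new) <= f(y) - np.dot(g, g) / (2 * L) + 1e-12:
                break
            L *= inc                       # estimate too small; increase it
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L = max(L0, L / inc)               # let the local estimate shrink again
    return x

# e.g. x = accel_gradient_backtracking(lambda v: v @ v, lambda v: 2 * v, np.ones(5))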

Optimality conditions for the nonlinear programming problems on Riemannian manifolds

In recent years, many traditional optimization methods have been successfully generalized to minimize objective functions on manifolds. In this paper, we first extend the traditional constrained optimization problem to a nonlinear programming problem on a general Riemannian manifold $\mathcal{M}$, and discuss first-order and second-order optimality conditions. By exploiting the differential geometry … Read more
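For the model problem $\min_{x\in\mathcal{M}} f(x)$ subject to $g_i(x)\le 0$, the expected first-order (KKT-type) condition at a local minimizer $\bar{x}$, under a suitable constraint qualification, reads (a hedged summary, with Euclidean gradients replaced by Riemannian ones):

\[
\operatorname{grad} f(\bar{x})+\sum_i \lambda_i \operatorname{grad} g_i(\bar{x})=0 \ \text{in } T_{\bar{x}}\mathcal{M},\qquad \lambda_i\ge 0,\qquad \lambda_i\, g_i(\bar{x})=0,
\]

where $T_{\bar{x}}\mathcal{M}$ is the tangent space at $\bar{x}$; second-order conditions likewise replace the Hessian of the Lagrangian by its Riemannian Hessian.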

A variable smoothing algorithm for solving convex optimization problems

In this article we propose a method for solving unconstrained optimization problems with convex and Lipschitz continuous objective functions. By making use of the Moreau envelopes of the functions occurring in the objective, we smooth the latter into a convex and differentiable function with Lipschitz continuous gradient by using both variable and constant smoothing parameters. … Read more
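To illustrate the key device (a hedged toy, not the paper's algorithm): the Moreau envelope $f_\mu(x)=\min_y\{f(y)+\tfrac{1}{2\mu}\|x-y\|^2\}$ is differentiable with $\nabla f_\mu(x)=(x-\operatorname{prox}_{\mu f}(x))/\mu$, which is Lipschitz with constant $1/\mu$; decreasing $\mu$ along the iterations tightens the approximation.

import numpy as np

def grad_moreau(prox, x, mu):
    # Gradient of the Moreau envelope: grad f_mu(x) = (x - prox_{mu f}(x)) / mu
    return (x - prox(x, mu)) / mu

def variable_smoothing(prox, x0, iters=500):
    # Gradient descent on f_mu with a decreasing smoothing parameter
    # mu_k ~ 1/sqrt(k) and step size mu_k (the inverse Lipschitz constant).
    x = x0.copy()
    for k in range(1, iters + 1):
        mu = 1.0 / np.sqrt(k)
        x = x - mu * grad_moreau(prox, x, mu)
    return x

# f = ||.||_1: its prox is soft-thresholding, and f_mu is the Huber function
soft = lambda v, m: np.sign(v) * np.maximum(np.abs(v) - m, 0.0)
x_min = variable_smoothing(soft, np.array([2.0, -1.5, 0.7]))  # tends to 0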