Directional Hölder metric subregularity and application to tangent cones

In this work, we study directional versions of the Hölderian/Lipschitzian metric subregularity of multifunctions. First, we establish variational characterizations of Hölderian/Lipschitzian directional metric subregularity in terms of strong slopes and then of mixed tangency-coderivative objects. As a by-product, we give second-order conditions for the directional Lipschitzian metric subregularity and for the directional metric … Read more
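
For orientation, recall the standard (non-directional) notion these results refine: a multifunction $F: X \rightrightarrows Y$ between metric spaces is Hölder metrically subregular of order $q \in (0,1]$ at $(\bar{x}, \bar{y}) \in \mathrm{gph}\, F$ if there exist $\tau > 0$ and a neighborhood $U$ of $\bar{x}$ such that

\[
d\big(x, F^{-1}(\bar{y})\big) \le \tau\, d\big(\bar{y}, F(x)\big)^{q} \quad \text{for all } x \in U.
\]

The case $q = 1$ is Lipschitzian metric subregularity; the directional versions studied here restrict $x$ to a directional neighborhood of $\bar{x}$. This is the textbook formulation, not necessarily the paper's exact definition.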

HIPAD – A Hybrid Interior-Point Alternating Direction algorithm for knowledge-based SVM and feature selection

We consider classification tasks in the regime of scarce labeled training data in a high-dimensional feature space, where specific expert knowledge is also available. We propose a new hybrid optimization algorithm that solves the elastic-net support vector machine (SVM) through an alternating direction method of multipliers in the first phase, followed by an interior-point method … Read more
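
As background, the elastic-net SVM combines the hinge loss with $\ell_1$ and squared $\ell_2$ penalties; a standard primal formulation (the weights $\lambda_1, \lambda_2$ and the handling of expert knowledge are illustrative here, not taken from the paper) is

\[
\min_{w,\, b} \; \sum_{i=1}^{n} \max\big(0,\, 1 - y_i (w^\top x_i + b)\big) \;+\; \lambda_1 \|w\|_1 \;+\; \frac{\lambda_2}{2} \|w\|_2^2 .
\]

The $\ell_1$ term drives feature selection while the $\ell_2$ term stabilizes the solution, which is why the model suits high-dimensional settings with scarce labels.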

Nonlinear local error bounds via a change of metric

In this work, we improve the approach of Corvellec-Motreanu to nonlinear error bounds for lower semicontinuous functions on complete metric spaces, an approach that reduces the nonlinear case to the linear one through a change of metric. This improvement is mainly technical and allows us to deal with local error bounds in an appropriate way. … Read more
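
To fix ideas, a nonlinear local error bound for a lower semicontinuous $f: X \to \mathbb{R} \cup \{+\infty\}$ with sublevel set $S = \{x : f(x) \le 0\}$ typically reads

\[
d(x, S) \le \varphi\big(\max\{f(x), 0\}\big) \quad \text{for all } x \text{ near } \bar{x} \in S,
\]

where $\varphi$ is a suitable gauge function with $\varphi(0) = 0$; the linear case $\varphi(t) = c\,t$ is the classical error bound. The change-of-metric idea is to absorb $\varphi$ into a new metric on $X$ so that the nonlinear bound becomes a linear one. This is a generic formulation for illustration, not the paper's precise statement.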

Conditional Gradient Sliding for Convex Optimization

In this paper, we present a new conditional gradient type method for convex optimization that uses a linear optimization (LO) oracle to minimize a series of linear functions over the feasible set. Unlike the classic conditional gradient method, the conditional gradient sliding (CGS) algorithm developed herein can skip the computation of gradients from time … Read more
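
To make the role of the LO oracle concrete, here is a minimal classic conditional gradient (Frank-Wolfe) loop over the unit simplex; the quadratic objective and the simplex oracle are illustrative choices, and CGS itself differs precisely in that it can reuse a gradient across several LO-oracle calls.

```python
import numpy as np

def lo_oracle_simplex(g):
    # LO oracle over the unit simplex: argmin_{v in simplex} <g, v>
    # is attained at the vertex e_i with i = argmin_i g_i.
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

def conditional_gradient(grad, x0, num_iters=100):
    # Classic Frank-Wolfe: one gradient and one LO-oracle call per iteration,
    # entirely projection-free.
    x = x0.copy()
    for k in range(num_iters):
        g = grad(x)
        v = lo_oracle_simplex(g)
        gamma = 2.0 / (k + 2.0)   # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Illustrative objective: f(x) = 0.5 * ||A x - b||^2 over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
x_star = conditional_gradient(grad, np.ones(5) / 5)
```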

Iteration Bounds for Finding the $\epsilon$-Stationary Points for Structured Nonconvex Optimization

In this paper we study proximal conditional-gradient (CG) and proximal gradient-projection type algorithms for a block-structured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of $\epsilon$-stationarity, which is suitable for the structured problem under consideration. We then propose two … Read more
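
The paper introduces its own structured notion, but for orientation, a common $\epsilon$-stationarity measure for a composite problem $\min_x f(x) + h(x)$ with $f$ smooth and $h$ proximable is the norm of the prox-gradient mapping:

\[
\|\mathcal{G}_\gamma(x)\| \le \epsilon, \qquad \mathcal{G}_\gamma(x) := \frac{1}{\gamma}\Big(x - \mathrm{prox}_{\gamma h}\big(x - \gamma \nabla f(x)\big)\Big),
\]

which vanishes exactly at stationary points. The block-structured notion introduced here adapts this idea to the tensor-driven model; the display above is not the paper's definition verbatim.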

Variational analysis and full stability of optimal solutions to constrained and minimax problems

The main goal of this paper is to develop applications of advanced tools of first-order and second-order variational analysis and generalized differentiation to the fundamental notion of full stability of local minimizers of general classes of constrained optimization and minimax problems. In particular, we derive second-order characterizations of full stability and investigate its relationships with … Read more
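
Roughly, full stability in the sense of Levy, Poliquin and Rockafellar concerns the two-parameter perturbed problem

\[
\mathcal{P}(v, u): \quad \min_{x} \; f(x, u) - \langle v, x \rangle,
\]

and calls $\bar{x}$ a fully stable local minimizer for $(\bar{v}, \bar{u})$ when the localized argmin mapping $(v, u) \mapsto x(v, u)$ is single-valued and Lipschitz continuous around $(\bar{v}, \bar{u})$ with $x(\bar{v}, \bar{u}) = \bar{x}$. This sketch omits the technical localization; the paper's precise definitions for constrained and minimax problems are more refined.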

An induction theorem and nonlinear regularity models

A general nonlinear regularity model for a set-valued mapping $F: X \times \mathbb{R}_+ \rightrightarrows Y$, where $X$ and $Y$ are metric spaces, is considered using special iteration procedures, going back to Banach, Schauder, Lusternik and Graves. Namely, we revise the induction theorem from Khanh, J. Math. Anal. Appl., 118 (1986) and employ it to obtain basic estimates for studying … Read more
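
The prototypical estimate such regularity models generalize is (Hölder) metric regularity: for $(\bar{x}, \bar{y}) \in \mathrm{gph}\, F$ there are $\tau > 0$ and $q \in (0, 1]$ with

\[
d\big(x, F^{-1}(y)\big) \le \tau\, d\big(y, F(x)\big)^{q} \quad \text{for all } (x, y) \text{ near } (\bar{x}, \bar{y}).
\]

Unlike subregularity, $y$ varies here rather than being fixed at $\bar{y}$. The extra argument $t \in \mathbb{R}_+$ in the paper's model encodes the iteration parameter of the procedures named above; this display is only the classical special case.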

The global weak sharp minima with explicit exponents in polynomial vector optimization problems

In this paper we discuss the global weak sharp minima property for vector optimization problems with polynomial data. Exploiting the imposed polynomial structure together with tools of variational analysis and a quantitative version of Łojasiewicz's gradient inequality due to D'Acunto and Kurdyka, we establish Hölder-type global weak sharp minima with explicitly calculated … Read more
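
In the scalar prototype, the Hölder-type global weak sharp minima property asserts the existence of $c > 0$ and an exponent $p > 0$ with

\[
f(x) - \min_{\mathbb{R}^n} f \;\ge\; c \, \mathrm{dist}(x, S)^{p} \quad \text{for all } x \in \mathbb{R}^n,
\]

where $S$ is the solution set; the Łojasiewicz gradient inequality supplies the exponent, and the D'Acunto-Kurdyka version makes it explicit in the dimension $n$ and the degree of the polynomial data. The vector-valued statement in the paper is more involved; this display is only the scalar guide.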

Convergence rate analysis of the forward-Douglas-Rachford splitting scheme

Operator splitting schemes are a powerful class of algorithms for solving complicated monotone inclusion and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which each simple piece of the decomposition is processed individually. This leads to easily implementable and highly parallelizable or distributed algorithms, which often obtain … Read more
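
As an illustration of a three-operator splitting of this family, here is a Davis-Yin-style iteration for $\min_x f(x) + g(x) + h(x)$ with $f, g$ proximable and $h$ smooth; the specific problem, prox choices, and step size are illustrative assumptions, not the paper's scheme verbatim.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proj_box(v, lo=0.0, hi=1.0):
    # Proximal operator of the indicator of [lo, hi]^n (projection).
    return np.clip(v, lo, hi)

# Illustrative problem: min_x f(x) + g(x) + h(x) with
#   f = indicator of [0,1]^n,  g = ||.||_1,  h(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
grad_h = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of grad_h
gamma = 1.0 / L                   # admissible step size (in (0, 2/L))

z = np.zeros(10)
for _ in range(500):
    x_f = proj_box(z)                                        # backward step on f
    x_g = prox_l1(2 * x_f - z - gamma * grad_h(x_f), gamma)  # forward step on h, backward on g
    z = z + x_g - x_f                                        # fixed-point correction
x = proj_box(z)  # converges to a solution of the three-term problem
```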

Interior-point solver for convex separable block-angular problems

Constraint matrices with block-angular structure are pervasive in optimization. Interior-point methods have been shown to be competitive for these structured problems by exploiting the linear algebra. One such approach solves the normal equations using sparse Cholesky factorizations for the block constraints and a preconditioned conjugate gradient (PCG) for the linking constraints. The preconditioner is based … Read more
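
Schematically, the block-angular constraint matrix the solver exploits has the form (notation illustrative, not taken from the paper)

\[
A =
\begin{bmatrix}
A_1 & & & \\
& \ddots & & \\
& & A_k & \\
L_1 & \cdots & L_k & I
\end{bmatrix},
\]

where the diagonal blocks $A_1, \dots, A_k$ carry the block constraints, handled by sparse Cholesky factorizations of the corresponding parts of the normal equations, and the bottom row $L_1, \dots, L_k$ carries the linking constraints, whose contribution is treated by the PCG.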