A Hausdorff-type distance, a directional derivative of a set-valued map and applications in set optimization

In this paper, we follow Kuroiwa’s set approach in set optimization, which proposes to compare values of a set-valued objective map $F$ with respect to various set order relations. We introduce a Hausdorff-type distance relative to an ordering cone between two sets in a Banach space and use it to define a directional derivative for $F$. … Read more
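
For orientation, the classical Hausdorff distance between nonempty sets $A,B$ in a normed space is
\[
d_H(A,B)=\max\Big\{\,\sup_{a\in A} d(a,B),\ \sup_{b\in B} d(b,A)\Big\},\qquad d(a,B)=\inf_{b\in B}\|a-b\|.
\]
A cone-relative variant in the spirit of the abstract would replace the distance to a set by the distance to that set enlarged by the ordering cone $C$, e.g. $d(a,B+C)$; this is a schematic illustration only, not necessarily the paper's exact definition.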

Facially dual complete (nice) cones and lexicographic tangents

We study the boundary structure of closed convex cones, with a focus on facially dual complete (nice) cones. These cones form a proper subset of facially exposed convex cones, and they behave well in the context of duality theory for convex optimization. Using the well-known concept of tangent cones in nonlinear … Read more
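
For reference, the tangent cone to a closed convex set $K$ at a point $x\in K$, the standard notion from which such tangent-based constructions start, is
\[
T_K(x)=\operatorname{cl}\{\,\lambda\,(y-x)\ :\ y\in K,\ \lambda\ge 0\,\},
\]
the closure of the cone of feasible directions at $x$.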

Partially separable convexly-constrained optimization with non-Lipschitz singularities and its complexity

An adaptive regularization algorithm using high-order models is proposed for partially separable convexly constrained nonlinear optimization problems whose objective function contains non-Lipschitzian $\ell_q$-norm regularization terms for $q\in (0,1)$. It is shown that the algorithm using a $p$-th order Taylor model for $p$ odd needs in general at most $O(\epsilon^{-(p+1)/p})$ evaluations of the objective function and … Read more
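
A prototypical instance of such an objective (an illustrative form with hypothetical element functions $f_i$, projections $U_i$, weight $\lambda$, and index set $J$, not quoted from the paper) is
\[
f(x)=\sum_{i=1}^{m} f_i(U_i x)\;+\;\lambda\sum_{j\in J}|x_j|^{q},\qquad q\in(0,1),
\]
where each element function $f_i$ depends only on a few variables through $U_i x$ (partial separability), and the $|x_j|^{q}$ terms are non-Lipschitz at the origin.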

Several variants of the primal-dual hybrid gradient algorithm with applications

By revisiting the primal-dual hybrid gradient algorithm (PDHA) proposed by He, You and Yuan (SIAM J. Imaging Sci. 2014;7(4):2526-2537), in this paper we introduce four improved schemes for solving a class of generalized saddle-point problems. By making use of the variational inequality framework, weaker conditions are presented to ensure the global convergence of the proposed algorithms, where … Read more
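
As a baseline for such variants, the classical primal-dual hybrid gradient iteration of Chambolle and Pock for the saddle-point problem $\min_x\max_y\ \langle Ax,y\rangle+g(x)-f^{*}(y)$ reads
\[
\begin{aligned}
x^{k+1} &= \operatorname{prox}_{\tau g}\big(x^{k}-\tau A^{\top}y^{k}\big),\\
\bar x^{k+1} &= x^{k+1}+\theta\,(x^{k+1}-x^{k}),\\
y^{k+1} &= \operatorname{prox}_{\sigma f^{*}}\big(y^{k}+\sigma A\,\bar x^{k+1}\big),
\end{aligned}
\]
with step sizes satisfying $\tau\sigma\|A\|^{2}<1$. This is only the standard template; the four improved schemes and the weaker convergence conditions are the contribution of the paper itself.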

Iteration-Complexity of a Linearized Proximal Multiblock ADMM Class for Linearly Constrained Nonconvex Optimization Problems

This paper analyzes the iteration-complexity of a class of linearized proximal multiblock alternating direction methods of multipliers (ADMM) for solving linearly constrained nonconvex optimization problems. The subproblems of the linearized ADMM are obtained by partially or fully linearizing the augmented Lagrangian with respect to the corresponding minimizing block variable. The derived complexity bounds do not … Read more
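
In generic form, for $\min\{\sum_{i} f_i(x_i):\ \sum_i A_i x_i = b\}$ with augmented Lagrangian $\mathcal{L}_{\beta}(x;\lambda)=\sum_i f_i(x_i)-\langle\lambda,\sum_i A_i x_i-b\rangle+\frac{\beta}{2}\|\sum_i A_i x_i-b\|^{2}$, a linearized proximal block update replaces the exact block minimization by
\[
x_i^{k+1}=\operatorname*{argmin}_{x_i}\ \Big\{\big\langle g_i^{k},\,x_i-x_i^{k}\big\rangle+\tfrac{1}{2}\big\|x_i-x_i^{k}\big\|_{H_i}^{2}\Big\},
\]
where $g_i^{k}$ collects the partially or fully linearized terms of $\mathcal{L}_{\beta}$ and $H_i\succ 0$ is a proximal weighting. The symbols $\beta$, $g_i^{k}$, and $H_i$ here are illustrative, not the paper's notation.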

Structural Properties of Affine Sparsity Constraints

We introduce a new constraint system for sparse variable selection in statistical learning. Such a system arises when logical conditions on the sparsity of certain unknown model parameters need to be incorporated into the selection process. Formally, extending a cardinality constraint, an affine sparsity constraint (ASC) is defined by a linear inequality … Read more
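
Schematically (an illustrative reading, not the paper's formal definition), write $s\in\{0,1\}^{n}$ for the support indicator of $x$, so $s_j=1$ exactly when $x_j\neq 0$; an ASC then takes the form
\[
A\,s\ \ge\ b.
\]
The cardinality constraint $\|x\|_0\le k$ is the special case $-\mathbf{1}^{\top}s\ge -k$, while a hierarchy rule such as "$x_2$ may be selected only if $x_1$ is" becomes $s_1-s_2\ge 0$.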

Generalized Self-Concordant Functions: A Recipe for Newton-type Methods

We study the smooth structure of convex functions by generalizing the powerful concept of \textit{self-concordance}, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call \textit{generalized self-concordant functions}. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. … Read more
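
Recall that classical self-concordance requires, for $\varphi(t)=f(x+tv)$, the inequality $|\varphi'''(t)|\le 2\,\varphi''(t)^{3/2}$. The generalized notion relaxes this, roughly, to
\[
|\varphi'''(t)|\ \le\ M_f\,\varphi''(t)^{\nu/2}\,\|v\|^{\,3-\nu}
\]
for a constant $M_f\ge 0$ and exponent $\nu>0$, recovering the classical case at $\nu=3$, $M_f=2$. This is a schematic form; the precise definition is given in the paper.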

On generalized-convex constrained multi-objective optimization

In this paper, we consider multi-objective optimization problems involving not necessarily convex constraints and componentwise generalized-convex (e.g., semi-strictly quasi-convex, quasi-convex, or explicitly quasi-convex) vector-valued objective functions acting between a real linear topological pre-image space and a finite-dimensional image space. For these multi-objective optimization problems, we show that the set of (strictly, weakly) … Read more
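
For reference, a function $f$ is quasi-convex if
\[
f(\lambda x+(1-\lambda)y)\ \le\ \max\{f(x),f(y)\}\qquad\text{for all }x,y\text{ and }\lambda\in[0,1],
\]
and semi-strictly quasi-convex if this inequality holds strictly for $\lambda\in(0,1)$ whenever $f(x)\neq f(y)$; the componentwise assumptions above are of this type.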

Linear Convergence of Proximal Incremental Aggregated Gradient Methods under Quadratic Growth Condition

Under the strong convexity assumption, several recent works studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a non-smooth convex function. In this paper, under the quadratic growth condition, a strictly weaker condition than strong convexity, … Read more
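
For concreteness, the quadratic growth condition asks that
\[
f(x)-f^{\star}\ \ge\ \tfrac{\mu}{2}\,\operatorname{dist}\big(x,\mathcal{X}^{\star}\big)^{2}
\]
for some $\mu>0$, where $\mathcal{X}^{\star}$ is the solution set. $\mu$-strong convexity implies this bound (with a unique minimizer), but the converse fails: $f(x)=(\max\{|x|-1,0\})^{2}$ satisfies it with solution set $[-1,1]$ while being flat, hence not strongly convex, on that interval.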

Radial Subgradient Descent

We present a subgradient method for solving non-smooth, non-Lipschitz convex optimization problems. The only structure assumed is that a strictly feasible point is known. We extend the work of Renegar [1] by taking a different perspective, leading to an algorithm which is conceptually more natural, has notably improved convergence rates, and for which the analysis … Read more
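
For contrast, the classical subgradient step is
\[
x^{k+1}=x^{k}-\alpha_k\,g^{k},\qquad g^{k}\in\partial f(x^{k}),
\]
and its standard convergence rates require a Lipschitz bound on $f$; the radial approach dispenses with that assumption, relying only on the known strictly feasible point.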