Partially separable convexly-constrained optimization with non-Lipschitz singularities and its complexity

An adaptive regularization algorithm using high-order models is proposed for partially separable convexly constrained nonlinear optimization problems whose objective function contains non-Lipschitzian $\ell_q$-norm regularization terms for $q\in (0,1)$. It is shown that the algorithm using a $p$-th order Taylor model for $p$ odd needs in general at most $O(\epsilon^{-(p+1)/p})$ evaluations of the objective function and … Read more
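
As a rough sketch only (the precise problem class and notation are the paper's and are not reproduced here), a partially separable objective with non-Lipschitzian regularization can be pictured as
\[
\min_{x \in \mathcal{F}} \; f(x) \;=\; \sum_{i=1}^{m} f_i(U_i x) \;+\; \lambda \sum_{j=1}^{n} |x_j|^{q}, \qquad q \in (0,1),
\]
where $\mathcal{F}$ is a convex feasible set, each element function $f_i$ depends only on a low-dimensional projection $U_i x$, and the $\ell_q$ terms are non-Lipschitz at zero; the element functions $f_i$, projections $U_i$, and weight $\lambda$ are illustrative placeholders rather than the paper's exact formulation.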

Distributionally Robust Mechanism Design

We study a mechanism design problem where an indivisible good is auctioned to multiple bidders, for each of whom it has a private value that is unknown to the seller and the other bidders. The agents perceive the ensemble of all bidder values as a random vector governed by an ambiguous probability distribution, which belongs … Read more
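
For intuition only (the paper's exact ambiguity set and mechanism space are not reproduced here), a distributionally robust seller maximizes worst-case expected revenue over an ambiguity set $\mathcal{P}$ of joint value distributions,
\[
\max_{(q,m)} \; \inf_{\mathbb{P} \in \mathcal{P}} \; \mathbb{E}_{\mathbb{P}}\!\Bigl[\, \textstyle\sum_{i} m_i(v) \Bigr],
\]
where $v$ is the vector of bidder values, $q$ an allocation rule, and $m$ a payment rule; the symbols $q$, $m$, and $\mathcal{P}$ are generic placeholders for the objects defined in the paper.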

Several variants of the primal-dual hybrid gradient algorithm with applications

By reviewing the primal-dual hybrid algorithm (PDHA) proposed by He, You and Yuan (SIAM J. Imaging Sci. 2014;7(4):2526-2537), in this paper we introduce four improved schemes for solving a class of generalized saddle-point problems. By making use of the variational inequality, weaker conditions are presented to ensure the global convergence of the proposed algorithms, where … Read more
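
As a generic reminder of the method being varied (a sketch of the standard primal-dual hybrid gradient iteration for $\min_x \max_y \langle Kx, y\rangle + f(x) - g^*(y)$, not one of the schemes proposed in the paper), one step reads
\[
x^{k+1} = \operatorname{prox}_{\tau f}\!\bigl(x^{k} - \tau K^{\top} y^{k}\bigr), \qquad
y^{k+1} = \operatorname{prox}_{\sigma g^{*}}\!\bigl(y^{k} + \sigma K (2x^{k+1} - x^{k})\bigr),
\]
with step sizes $\tau,\sigma > 0$; the four improved schemes introduced in the paper differ from this template in ways not captured by the truncated abstract above.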

Bilevel optimization with a multiobjective problem in the lower level

Bilevel problems model instances with a hierarchical structure. Aiming at an efficient solution of a constrained multiobjective problem according to some pre-defined criterion, we reformulate this nonstandard optimization problem as a classic bilevel one. This reformulation is intended to encompass all the objectives, so that the properly efficient solution set is recovered by means … Read more
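
A schematic version of such a reformulation (the notation here is illustrative, not the paper's) is
\[
\min_{x}\; \phi(x) \quad \text{s.t.} \quad x \in \operatorname{Eff}\bigl\{\, \bigl(f_1(y),\dots,f_m(y)\bigr) : y \in C \,\bigr\},
\]
i.e. the upper level optimizes a pre-defined selection criterion $\phi$ over the (properly) efficient set of the constrained multiobjective lower-level problem; $\phi$, $f_1,\dots,f_m$, and $C$ are placeholders for the data of the original problem.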

Iteration-Complexity of a Linearized Proximal Multiblock ADMM Class for Linearly Constrained Nonconvex Optimization Problems

This paper analyzes the iteration-complexity of a class of linearized proximal multiblock alternating direction method of multipliers (ADMM) for solving linearly constrained nonconvex optimization problems. The subproblems of the linearized ADMM are obtained by partially or fully linearizing the augmented Lagrangian with respect to the corresponding minimizing block variable. The derived complexity bounds do not … Read more
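
For orientation (a generic sketch, not the paper's exact update), a linearized proximal ADMM step for $\min \sum_i f_i(x_i)$ subject to $\sum_i A_i x_i = b$ works with the augmented Lagrangian
\[
\mathcal{L}_{\beta}(x_1,\dots,x_N;\lambda) \;=\; \sum_{i=1}^{N} f_i(x_i) \;-\; \Bigl\langle \lambda, \textstyle\sum_i A_i x_i - b \Bigr\rangle \;+\; \tfrac{\beta}{2}\Bigl\|\textstyle\sum_i A_i x_i - b\Bigr\|^2
\]
and replaces each block minimization by a step on a partial or full linearization of $\mathcal{L}_{\beta}$ at the current iterate plus a proximal term of the form $\tfrac{1}{2}\|x_i - x_i^{k}\|_{H_i}^2$; the matrices $H_i$ and the exact linearization scheme analyzed in the paper are not reproduced here.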

Structural Properties of Affine Sparsity Constraints

We introduce a new constraint system for sparse variable selection in statistical learning. Such a system arises when there are logical conditions on the sparsity of certain unknown model parameters that need to be incorporated into their selection process. Formally, extending a cardinality constraint, an affine sparsity constraint (ASC) is defined by a linear inequality … Read more
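
One common way to write such constraints (a hedged guess at the general shape, since the defining inequality is truncated above) is as a linear inequality in the support indicators of the model parameters,
\[
\sum_{j=1}^{n} a_{j}\,\|x_{j}\|_{0} \;\ge\; b, \qquad \|x_j\|_0 = \begin{cases} 1 & \text{if } x_j \neq 0,\\ 0 & \text{if } x_j = 0,\end{cases}
\]
which can encode logical conditions such as "variable $j$ may enter the model only if variable $k$ does"; the coefficients $a_j$, $b$ and the direction of the inequality are illustrative, not the paper's definition.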

Small and Strong Formulations for Unions of Convex Sets from the Cayley Embedding

There is often a significant trade-off between formulation strength and size in mixed integer programming (MIP). When modeling convex disjunctive constraints (e.g. unions of convex sets), adding auxiliary continuous variables can sometimes help resolve this trade-off. However, standard formulations that use such auxiliary continuous variables can have a worse-than-expected computational effectiveness, which is often attributed … Read more
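
For context (a standard extended formulation for a union of convex sets, given here only as a sketch and not as the Cayley-embedding construction studied in the paper), a point $x \in \bigcup_{i=1}^{k} C_i$ can be modeled with auxiliary continuous copies $x^i$ and binary indicators $\lambda_i$:
\[
x = \sum_{i=1}^{k} x^{i}, \qquad x^{i} \in \lambda_{i} C_{i}, \qquad \sum_{i=1}^{k} \lambda_{i} = 1, \qquad \lambda \in \{0,1\}^{k},
\]
where $\lambda_i C_i$ denotes the scaled (perspective) set; it is this kind of auxiliary-variable formulation whose size/strength trade-off the paper revisits.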

From Data to Decisions: Distributionally Robust Optimization is Optimal

We study stochastic programs where the decision-maker cannot observe the distribution of the exogenous uncertainties but has access to a finite set of independent samples from this distribution. In this setting, the goal is to find a procedure that transforms the data to an estimate of the expected cost function under the unknown data-generating distribution, … Read more
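
Schematically (the notation is illustrative), a data-driven procedure maps samples $\xi_1,\dots,\xi_N \sim \mathbb{P}$ to a cost estimate $\hat{c}_N(x)$, and a distributionally robust choice takes a worst case over a neighborhood of the empirical distribution $\hat{\mathbb{P}}_N$:
\[
\hat{c}_N(x) \;=\; \sup_{\mathbb{Q} \in \mathcal{B}_{\varepsilon}(\hat{\mathbb{P}}_N)} \mathbb{E}_{\mathbb{Q}}\bigl[c(x,\xi)\bigr];
\]
the specific ambiguity set $\mathcal{B}_{\varepsilon}$ and the optimality criterion under which such an estimator is shown to be best are those of the paper and are not reproduced here.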

A Bregman alternating direction method of multipliers for sparse probabilistic Boolean network problem

A main task in the study of genetic regulatory networks is to construct a sparse probabilistic Boolean network (PBN) based on a given transition-probability matrix and a set of Boolean networks (BNs). In this paper, a Bregman alternating direction method of multipliers (BADMM) is proposed to solve the minimization problem arising in PBN construction. All the customized subproblem-solvers of … Read more
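
Roughly (a hedged sketch of the usual formulation, not necessarily the exact model solved in the paper), given BN transition matrices $A_1,\dots,A_n$ and a target transition-probability matrix $P$, one seeks a sparse probability vector $w$ via
\[
\min_{w}\; \Bigl\| \sum_{i=1}^{n} w_i A_i - P \Bigr\|_F^2 + \lambda\,\|w\|_0
\quad \text{s.t.} \quad \sum_{i=1}^{n} w_i = 1,\; w \ge 0,
\]
where $\lambda > 0$ controls sparsity; the exact penalty and the splitting used by the BADMM are as defined in the paper.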

Generalized Self-Concordant Functions: A Recipe for Newton-type Methods

We study the smooth structure of convex functions by generalizing the powerful concept of \textit{self-concordance}, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call \textit{generalized self-concordant functions}. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. … Read more
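
As a reference point (the classical definition, not the paper's generalization), a convex function $f \in C^{3}$ is standard self-concordant with parameter $M_f \ge 0$ if, along every line $\varphi(t) = f(x + tv)$,
\[
\bigl|\varphi'''(t)\bigr| \;\le\; M_f\,\varphi''(t)^{3/2};
\]
loosely speaking, the generalized notion studied in the paper relaxes this inequality (the exponent and an accompanying norm factor), enlarging the class of convex functions to which Newton-type analyses apply.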