Convexity and continuity of specific set-valued maps and their extremal value functions

In this paper, we study several classes of set-valued maps, which can be used in set-valued optimization and its applications, and their respective maximum and minimum value functions. The definitions of these maps are based on scalar-valued, vector-valued, and cone-valued maps. Moreover, we consider those extremal value functions which are obtained when optimizing linear functionals …
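For orientation, with $F$ the set-valued map and $\ell$ a continuous linear functional, the extremal value functions in question take roughly the following form (notation ours, with inf/sup when the extrema are not attained):

$$\varphi_\ell^{\min}(x) = \inf_{y \in F(x)} \ell(y), \qquad \varphi_\ell^{\max}(x) = \sup_{y \in F(x)} \ell(y).$$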

On convexity and quasiconvexity of extremal value functions in set optimization

We study different classes of convex and quasiconvex set-valued maps defined by means of the lower-less order relation and the upper-less order relation. The aim of this paper is to formulate necessary and especially sufficient conditions for the convexity/quasiconvexity of extremal value functions. Citation: DOI 10.23952/asvao.3.2021.3.04.
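For readers unfamiliar with these relations, the standard Kuroiwa-style definitions (the paper's exact conventions may differ slightly) compare nonempty sets $A, B$ with respect to a closed convex cone $C$:

$$A \preceq^{l} B \;\Longleftrightarrow\; B \subseteq A + C, \qquad A \preceq^{u} B \;\Longleftrightarrow\; A \subseteq B - C.$$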

Calculating Optimistic Likelihoods Using (Geodesically) Convex Optimization

A fundamental problem arising in many areas of machine learning is the evaluation of the likelihood of a given observation under different nominal distributions. Frequently, these nominal distributions are themselves estimated from data, which makes them susceptible to estimation errors. We thus propose to replace each nominal distribution with an ambiguity set containing all distributions …
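One natural way to formalize this optimistic viewpoint (notation ours, not necessarily the paper's) is to evaluate the best-case log-likelihood over an ambiguity ball of radius $\rho$ around the nominal distribution $\hat{\mathbb{P}}$:

$$\hat{\ell}(x) \;=\; \sup_{\mathbb{Q} \,:\, d(\mathbb{Q}, \hat{\mathbb{P}}) \le \rho} \log q(x),$$

where $q$ is the density of $\mathbb{Q}$ and $d$ is a statistical distance such as the Fisher-Rao distance or the Kullback-Leibler divergence; the (geodesic) convexity in the title refers to solving such problems over suitable manifolds of distributions.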

Characterizations of explicitly quasiconvex vector functions w.r.t. polyhedral cones

The aim of this paper is to present new characterizations of explicitly cone-quasiconvex vector functions with respect to a polyhedral cone of a finite-dimensional Euclidean space. These characterizations are given in terms of classical explicit quasiconvexity of certain real-valued functions, defined by composing the vector-valued function with appropriate scalarization functions, namely the extreme directions of …
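As background, the underlying notion (standard, not specific to this paper) is: $f : \mathbb{R}^n \to \mathbb{R}^m$ is quasiconvex with respect to a cone $K \subseteq \mathbb{R}^m$ if every lower level set

$$L_z = \{\, x \in \mathbb{R}^n : f(x) \in z - K \,\}, \qquad z \in \mathbb{R}^m,$$

is convex; roughly, the explicit variant additionally requires a semistrict quasiconvexity property, and the polyhedral structure of $K$ lets one test these conditions through finitely many scalarizations.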

Fast and Faster Convergence of SGD for Over-Parameterized Models and an Accelerated Perceptron

Modern machine learning focuses on highly expressive models that are able to fit or interpolate the data completely, resulting in zero training loss. For such models, we show that the stochastic gradients of common loss functions satisfy a strong growth condition. Under this condition, we prove that constant step-size stochastic gradient descent (SGD) with Nesterov …
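The strong growth condition in question is commonly stated as follows (notation ours): for $f = \frac{1}{n}\sum_i f_i$ and some constant $\rho \ge 1$,

$$\mathbb{E}_i\big[\|\nabla f_i(x)\|^2\big] \;\le\; \rho\, \|\nabla f(x)\|^2 \quad \text{for all } x,$$

which forces every stochastic gradient to vanish wherever the full gradient does, a property that interpolation makes plausible.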

A fundamental proof to convergence analysis of alternating direction method of multipliers for weakly convex optimization

Convergence analyses of the alternating direction method of multipliers (ADMM) for convex/nonconvex composite optimization are well established in the literature. Considering the extensive applications of weakly convex functions in signal processing and machine learning (see, e.g., the special issue DC-Theory, Algorithms and Applications, Mathematical Programming, 169(1):1-336, 2018), in this paper we investigate the convergence analysis of ADMM …
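For reference, a function $h$ is $\rho$-weakly convex if $h + \frac{\rho}{2}\|\cdot\|^2$ is convex, and the generic ADMM template such analyses concern (the paper's exact problem format may differ) applies to $\min_{x,z} f(x) + g(z)$ subject to $Ax + Bz = c$, iterating

$$x^{k+1} \in \arg\min_x \mathcal{L}_\beta(x, z^k, y^k), \quad z^{k+1} \in \arg\min_z \mathcal{L}_\beta(x^{k+1}, z, y^k), \quad y^{k+1} = y^k + \beta\,(A x^{k+1} + B z^{k+1} - c),$$

where $\mathcal{L}_\beta$ is the augmented Lagrangian with penalty parameter $\beta$.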

Pareto efficient solutions in multi-objective optimization involving forbidden regions

In this paper, the aim is to compute Pareto efficient solutions of multi-objective optimization problems involving forbidden regions. More precisely, we assume that the vector-valued objective function is componentwise generalized-convex and acts between a real topological linear pre-image space and a finite-dimensional image space, while the feasible set is given by the whole pre-image space …
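Recall the standard componentwise notion these results build on: a feasible point $\bar{x}$ is Pareto efficient if there is no feasible $x$ with $f_i(x) \le f_i(\bar{x})$ for all $i$ and $f(x) \ne f(\bar{x})$; the difficulty addressed here is computing such points when forbidden regions are excised from the feasible set.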

Inner Conditions for Error Bounds and Metric Subregularity of Multifunctions

We introduce a new class of sets, functions and multifunctions which is shown to be large and to enjoy some nice common properties with the convex setting. Error bounds for objects attached to this class are characterized in terms of inner conditions of Abadie’s type, that is conditions bearing on normal cones and coderivatives at …
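As a reference point, a multifunction $F$ is metrically subregular at $(\bar{x}, \bar{y}) \in \operatorname{gph} F$ if there exist $\kappa > 0$ and a neighborhood $U$ of $\bar{x}$ such that

$$d\big(x, F^{-1}(\bar{y})\big) \;\le\; \kappa\, d\big(\bar{y}, F(x)\big) \quad \text{for all } x \in U;$$

roughly speaking, error bounds correspond to this property for multifunctions built from level sets of a function.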

A Hausdorff-type distance, a directional derivative of a set-valued map and applications in set optimization

In this paper, we follow Kuroiwa’s set approach in set optimization, which proposes to compare values of a set-valued objective map $F$ with respect to various set order relations. We introduce a Hausdorff-type distance relative to an ordering cone between two sets in a Banach space and use it to define a directional derivative for $F$. …
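For comparison, the classical Hausdorff distance between nonempty sets $A, B$ is

$$d_H(A, B) = \max\Big\{ \sup_{a \in A} d(a, B),\; \sup_{b \in B} d(b, A) \Big\};$$

the distance introduced in the paper modifies the two excess terms relative to the ordering cone, in a way not reproduced here.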

A Note on the Forward-Douglas–Rachford Splitting for Monotone Inclusion and Convex Optimization

We shed light on the structure of the “three-operator” version of the forward-Douglas–Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive; the method involves only forward evaluations of $B$ and the resolvents of $A$ and $C$, computed separately. We show that it …
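As a minimal numerical sketch (ours, not the paper's exact scheme), the following Davis–Yin-style three-operator iteration matches the description above: one resolvent of $A$, one resolvent of $C$, and one forward evaluation of the cocoercive $B$ per step. The concrete operators below are an illustrative assumption, chosen so every step has a closed form.

    import numpy as np

    def soft_threshold(v, t):
        # Resolvent (prox) of the l1 norm with parameter t.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    rng = np.random.default_rng(0)
    n = 5
    b = rng.normal(size=n)

    # Illustrative operators: A = subdifferential of ||.||_1,
    # B = gradient of 0.5*||x - b||^2 (1-cocoercive),
    # C = normal cone of the box [-1, 1]^n.
    gamma = 1.0   # step size; any gamma in (0, 2) works for a 1-cocoercive B
    z = np.zeros(n)
    for _ in range(200):
        x = soft_threshold(z, gamma)                  # resolvent of A
        grad = x - b                                  # forward evaluation of B
        y = np.clip(2 * x - z - gamma * grad, -1, 1)  # resolvent of C
        z = z + (y - x)                               # governing-sequence update

    print("approximate minimizer:", x)

At a fixed point $z^\star$ of the update, $x^\star = J_{\gamma A}(z^\star)$ satisfies $0 \in A x^\star + B x^\star + C x^\star$, here the minimizer of $\|x\|_1 + \tfrac{1}{2}\|x - b\|^2$ over the box.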