A four-operator splitting algorithm for nonconvex and nonsmooth optimization

In this work, we address a class of nonconvex nonsmooth optimization problems where the objective function is the sum of two smooth functions (one of which is proximable) and two nonsmooth functions (one weakly convex and proximable and the other continuous and weakly concave). We introduce a new splitting algorithm that extends the Davis-Yin … Read more
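The four-operator scheme itself lies behind the truncation, but as a point of reference, here is a minimal NumPy sketch of the classical Davis-Yin three-operator splitting that the abstract says is being extended. The operators `prox_f`, `prox_g`, `grad_h`, the toy data, and the stepsize are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def davis_yin(prox_f, prox_g, grad_h, z0, step, n_iters=500):
    """Classical Davis-Yin three-operator splitting for
    min_x f(x) + g(x) + h(x), with f and g proximable and h smooth.
    Shown only as the baseline that the four-operator method extends."""
    z = z0.copy()
    for _ in range(n_iters):
        x_g = prox_g(z, step)                                    # backward step on g
        x_f = prox_f(2.0 * x_g - z - step * grad_h(x_g), step)   # backward on f, forward on h
        z = z + x_f - x_g                                        # fixed-point correction
    return x_g

# Toy convex instance: f = 0.1*||.||_1, g = indicator of a box, h = 0.5*||Ax - b||^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
prox_f = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - 0.1 * t, 0.0)  # soft-thresholding
prox_g = lambda v, t: np.clip(v, -1.0, 1.0)                              # box projection
grad_h = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # below 2/L with L = ||A||^2
x = davis_yin(prox_f, prox_g, grad_h, np.zeros(10), step)
```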

Complexity of Adagrad and other first-order methods for nonconvex optimization problems with bounds and convex constraints

A parametric class of trust-region algorithms for constrained nonconvex optimization is analyzed, where the objective function is never computed. By defining appropriate first-order stationarity criteria, we are able to extend the Adagrad method to the newly considered problem and retrieve the standard complexity rate of the projected gradient method that uses both the gradient and … Read more
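As one rough reading of the setting (Adagrad adapted to bound and convex constraints via projection), here is a hedged sketch of a diagonal Adagrad step followed by projection onto a box. The function names, stepsize, and projection are illustrative assumptions and not the variant analyzed in the paper.

```python
import numpy as np

def projected_adagrad(grad, project, x0, eta=0.1, eps=1e-8, n_iters=1000):
    """Diagonal Adagrad step followed by a projection onto the feasible set.
    A generic sketch only, not the specific method whose complexity is analyzed."""
    x = x0.copy()
    g_acc = np.zeros_like(x)                        # running sum of squared gradients
    for _ in range(n_iters):
        g = grad(x)
        g_acc += g ** 2
        x = x - eta * g / (np.sqrt(g_acc) + eps)    # per-coordinate Adagrad step
        x = project(x)                              # e.g. clip to bound constraints
    return x

# Example with simple bounds 0 <= x <= 1 and a smooth nonconvex objective.
grad = lambda x: np.cos(x) + 0.2 * x
project = lambda x: np.clip(x, 0.0, 1.0)
x_star = projected_adagrad(grad, project, x0=np.full(5, 0.5))
```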

Composite optimization models via proximal gradient method with increasing adaptive stepsizes

We first consider convex composite optimization models without a global Lipschitz condition imposed on the gradient of the differentiable term. The classical proximal gradient method is studied with our new strategy of stepsize selection. The idea for constructing such a stepsize is motivated by the one in \cite{hoai} used for gradient … Read more
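The stepsize rule from \cite{hoai} is not visible in the snippet, so the sketch below pairs the proximal gradient iteration with a generic increasing, curvature-capped stepsize as a stand-in; all names and constants are illustrative assumptions.

```python
import numpy as np

def prox_grad_adaptive(grad_f, prox_g, x0, step0=1.0, growth=1.2, n_iters=500):
    """Proximal gradient x+ = prox_{t g}(x - t*grad_f(x)) with a stepsize that
    is allowed to grow, capped by a local curvature estimate built from two
    consecutive iterates. This rule is only a stand-in for the cited one."""
    x_prev, x = x0.copy(), x0.copy()
    g_prev = grad_f(x)
    t = step0
    for _ in range(n_iters):
        g = grad_f(x)
        dx, dg = np.linalg.norm(x - x_prev), np.linalg.norm(g - g_prev)
        t = min(growth * t, dx / dg) if dg > 0 else growth * t   # local inverse-Lipschitz cap
        x_prev, g_prev = x, g
        x = prox_g(x - t * g, t)
    return x

# Toy problem: f(x) = 0.5*||Ax - b||^2, g = 0.05*||x||_1.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - 0.05 * t, 0.0)
x = prox_grad_adaptive(grad_f, prox_g, np.zeros(10))
```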

Accelerated Fully First-Order Methods for Bilevel and Minimax Optimization

We present in this paper novel accelerated fully first-order methods in \emph{Bilevel Optimization} (BiO). Firstly, for BiO under the assumption that the lower-level functions satisfy the typical strong convexity condition, the \emph{(Perturbed) Restarted Accelerated Fully First-order methods for Bilevel Approximation} (\PRAFFBA) algorithm leveraging \emph{fully} first-order oracles is proposed, whereas the algorithm for finding approximate first-order … Read more
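The accelerated and restarted machinery is behind the truncation; the sketch below only illustrates the generic "fully first-order" device, a value-function penalty whose gradient approximates the hypergradient using gradients of the upper- and lower-level objectives alone. The inner solver, penalty weight `lam`, and oracle names are assumptions, not the paper's algorithm.

```python
import numpy as np

def inner_gd(grad, y0, step=0.01, n_iters=2000):
    """Plain gradient descent used as the inner solver (illustrative only)."""
    y = y0.copy()
    for _ in range(n_iters):
        y = y - step * grad(y)
    return y

def fully_first_order_hypergrad(gx_f, gy_f, gx_g, gy_g, x, y0, lam=50.0):
    """Approximate hypergradient of F(x) = f(x, y*(x)), y*(x) = argmin_y g(x, y),
    using only first-order oracles via the value-function penalty
    L_lam(x) = min_y [f(x,y) + lam*g(x,y)] - lam * min_y g(x,y)."""
    y_pen = inner_gd(lambda y: gy_f(x, y) + lam * gy_g(x, y), y0)  # argmin of f + lam*g
    y_low = inner_gd(lambda y: gy_g(x, y), y0)                     # argmin of g
    return gx_f(x, y_pen) + lam * (gx_g(x, y_pen) - gx_g(x, y_low))

# Toy check: g(x, y) = 0.5*||y - x||^2 so y*(x) = x, and f(x, y) = 0.5*||y||^2,
# whose true hypergradient is dF/dx = x; the estimate approaches it as lam grows.
gx_f = lambda x, y: np.zeros_like(x)
gy_f = lambda x, y: y
gx_g = lambda x, y: x - y
gy_g = lambda x, y: y - x
h = fully_first_order_hypergrad(gx_f, gy_f, gx_g, gy_g, np.array([1.0, -2.0]), y0=np.zeros(2))
```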

Optimization without Retraction on the Random Generalized Stiefel Manifold

Optimization over the set of matrices \(X\) that satisfy \(X^\top B X = I_p\), referred to as the generalized Stiefel manifold, appears in many applications involving sampled covariance matrices such as the canonical correlation analysis (CCA), independent component analysis (ICA), and the generalized eigenvalue problem (GEVP). Solving these problems is typically done by iterative … Read more
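As a minimal illustration of the retraction-free idea, the sketch below takes one gradient step on a quadratic penalty of the constraint \(X^\top B X = I_p\), so the iterate is pulled toward feasibility without any retraction. The objective, penalty weight, and stepsize are assumptions; the paper's randomized landing-type update is not reproduced here.

```python
import numpy as np

def penalty_step(X, grad_f, B, step=1e-3, lam=10.0):
    """One gradient step on f(X) + (lam/4)*||X^T B X - I||_F^2, which pulls the
    iterate toward the generalized Stiefel constraint without a retraction.
    Rough sketch only; not the paper's randomized landing-type update."""
    p = X.shape[1]
    residual = X.T @ B @ X - np.eye(p)      # constraint violation
    penalty_grad = B @ X @ residual         # gradient of (1/4)||X^T B X - I||_F^2 (B symmetric)
    return X - step * (grad_f(X) + lam * penalty_grad)

# Example setup: GEVP/CCA-style objective f(X) = -0.5*tr(X^T A X), B symmetric positive definite.
rng = np.random.default_rng(2)
n, p = 8, 2
M = rng.standard_normal((n, n)); A = 0.5 * (M + M.T)
N_ = rng.standard_normal((n, n)); B = N_ @ N_.T / n + np.eye(n)
grad_f = lambda X: -A @ X
X = np.linalg.solve(np.linalg.cholesky(B).T, np.linalg.qr(rng.standard_normal((n, p)))[0])
X = penalty_step(X, grad_f, B)              # iterates remain only approximately feasible
```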

Floorplanning with I/O assignment via feasibility-seeking and superiorization methods

The feasibility-seeking approach offers a systematic framework for managing and resolving intricate constraints in continuous problems, making it a promising avenue to explore in the context of floorplanning problems with increasingly heterogeneous constraints. The classic legality constraints can be expressed as the union of convex sets. However, conventional projection-based algorithms for feasibility-seeking do not guarantee … Read more
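To make the "union of convex sets" constraint concrete, here is a small sketch of the elementary projection step: projecting a point onto a union of axis-aligned boxes by keeping the nearest member projection, then sweeping cyclically over several such constraints. The box data are illustrative, and the superiorization layer of the paper is not shown. Note that projection onto a union of convex sets is not nonexpansive in general, which is one reason convexity-based projection guarantees can break down.

```python
import numpy as np

def project_box(point, lo, hi):
    """Euclidean projection of a point onto the axis-aligned box [lo, hi]."""
    return np.clip(point, lo, hi)

def project_union_of_boxes(point, boxes):
    """Project onto a union of boxes: project onto each member, keep the closest.
    Elementary building block of feasibility-seeking for legality constraints
    expressed as unions of convex sets; not the paper's full scheme."""
    candidates = [project_box(point, lo, hi) for lo, hi in boxes]
    dists = [np.linalg.norm(c - point) for c in candidates]
    return candidates[int(np.argmin(dists))]

# One cyclic sweep over two such constraints (illustrative placement regions).
constraints = [
    [(np.array([0.0, 0.0]), np.array([2.0, 1.0])), (np.array([3.0, 0.0]), np.array([5.0, 1.0]))],
    [(np.array([0.0, 2.0]), np.array([4.0, 4.0]))],
]
x = np.array([2.6, 1.8])
for boxes in constraints:
    x = project_union_of_boxes(x, boxes)
```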

A mathematical introduction to SVMs with self-concordant kernel

A derivation of the so-called “soft-margin Support Vector Machines with kernel” is presented that does not rely on concepts from functional analysis such as Mercer’s theorem, which is frequently cited in this context, and that leads to a new analysis of the continuity properties of the kernel functions, such as a new self-concordance condition for the … Read more
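To ground the objects in the abstract, the sketch below builds a Gaussian-kernel Gram matrix and runs projected gradient ascent on the bias-free soft-margin dual. The kernel bandwidth, regularization constant, and solver are illustrative assumptions and are unrelated to the paper's self-concordance analysis.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def svm_dual_projected_ascent(K, y, C=1.0, step=1e-2, n_iters=2000):
    """Projected gradient ascent on the bias-free soft-margin dual
    max_a 1^T a - 0.5 a^T Q a,  0 <= a <= C,  with Q = (y y^T) * K.
    Minimal illustration only, not a production SVM trainer."""
    Q = np.outer(y, y) * K
    a = np.zeros(len(y))
    for _ in range(n_iters):
        a = np.clip(a + step * (1.0 - Q @ a), 0.0, C)
    return a

# Decision function f(x) = sum_i a_i y_i k(x_i, x) on a toy two-class data set.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
a = svm_dual_projected_ascent(rbf_kernel(X, X), y)
f = lambda x: np.sum(a * y * rbf_kernel(X, np.atleast_2d(x)).ravel())
```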

solar: A solar thermal power plant simulator for blackbox optimization benchmarking

This work introduces solar, a collection of ten optimization problem instances for benchmarking blackbox optimization solvers. The instances present different design aspects of a concentrated solar power plant simulated by blackbox numerical models. The type of variables (discrete or continuous), dimensionality, and number and types of constraints (including hidden constraints) differ across instances. Some are deterministic, others are stochastic … Read more

Exploiting cone approximations in an augmented Lagrangian method for conic optimization

We propose an algorithm for general nonlinear conic programming which does not require the knowledge of the full cone, but rather a simpler, more tractable, approximation of it. We prove that the algorithm satisfies a strong global convergence property in the sense that it generates a strong sequential optimality condition. In particular, a KKT point … Read more
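As a generic backdrop, here is a sketch of a standard augmented Lagrangian scheme for a conic constraint g(x) ∈ K, specialized to K being the nonnegative orthant so that the cone projection is a simple clip. The paper's point, working with a simpler, tractable approximation of a more complicated cone, is only noted in a comment, and all solver parameters and names are assumptions.

```python
import numpy as np

def augmented_lagrangian(f_grad, g, g_jac, x0, rho=10.0, n_outer=20, inner_iters=500, step=1e-3):
    """Augmented Lagrangian for min f(x) s.t. g(x) in K, with K the nonnegative
    orthant, so projecting onto K is a clip. The paper instead allows replacing
    an intractable cone by a simpler approximation inside such a scheme; here
    the cone is already elementary, so no approximation is needed."""
    x = x0.copy()
    lam = np.zeros(len(g(x0)))                    # multiplier estimate
    for _ in range(n_outer):
        # Inner loop: gradient descent on the shifted-penalty augmented Lagrangian
        # L(x) = f(x) + (rho/2)*||max(0, lam/rho - g(x))||^2 (up to a constant).
        for _ in range(inner_iters):
            slack = np.maximum(0.0, lam / rho - g(x))
            grad = f_grad(x) - g_jac(x).T @ (rho * slack)
            x = x - step * grad
        lam = np.maximum(0.0, lam - rho * g(x))   # multiplier update = cone projection
    return x, lam

# Toy instance: min 0.5*||x - c||^2 subject to x >= 0, i.e. g(x) = x in the orthant.
c = np.array([1.0, -2.0, 0.5])
f_grad = lambda x: x - c
g = lambda x: x
g_jac = lambda x: np.eye(len(x))
x_opt, lam_opt = augmented_lagrangian(f_grad, g, g_jac, x0=np.zeros(3))
```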