Generic properties for semialgebraic programs

In this paper we study genericity for the following parameterized class of nonlinear programs: \begin{eqnarray*} \textrm{minimize } f_u(x) := f(x) - \langle u, x \rangle \quad \textrm{subject to } \quad x \in S, \end{eqnarray*} where $f \colon \mathbb{R}^n \rightarrow \mathbb{R}$ is a polynomial function and $S \subset \mathbb{R}^n$ is a closed semialgebraic set, which is … Read more
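As a concrete illustration (not an example taken from the paper), take $f(x) = x_1^4 + x_2^4$ and the closed semialgebraic set $S = \{x \in \mathbb{R}^2 : x_1^2 + x_2^2 \le 1\}$; the perturbed program then reads \begin{eqnarray*} \textrm{minimize } f_u(x) = x_1^4 + x_2^4 - u_1 x_1 - u_2 x_2 \quad \textrm{subject to} \quad x_1^2 + x_2^2 \le 1, \end{eqnarray*} and genericity concerns properties of this program that hold for all parameters $u$ outside a small (e.g. measure-zero) exceptional set.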

A Theoretical and Algorithmic Characterization of Bulge Knees

This paper deals with the problem of finding convex bulges on the Pareto front of a multi-objective optimization problem. The point of maximum bulge is of particular interest, as it exhibits good trade-off properties and lies close to the non-attainable utopia point. Our approach is to use a population-based algorithm to simultaneously … Read more
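To make the idea of a bulge concrete, the sketch below implements one common knee-detection heuristic for a two-dimensional Pareto front: pick the point with maximum perpendicular distance from the line joining the two extreme points of the front. This is an illustration under that assumption only; the paper's definition of a convex bulge and its population-based algorithm may differ, and the function name knee_by_max_bulge and the toy front are hypothetical.

import numpy as np

def knee_by_max_bulge(front):
    # Heuristic knee detector: return the point of a 2-D Pareto front that
    # bulges farthest from the straight line joining its two extreme points.
    pts = np.asarray(sorted(map(tuple, front)), dtype=float)  # sort by first objective
    a, b = pts[0], pts[-1]                                    # extreme points of the front
    ab = b - a
    # perpendicular distance of each point from the line through a and b
    dist = np.abs(ab[0] * (pts[:, 1] - a[1]) - ab[1] * (pts[:, 0] - a[0]))
    dist /= np.linalg.norm(ab)
    return pts[np.argmax(dist)]

# toy convex front y = (1 - sqrt(x))^2 on [0, 1]; the detected knee is (0.25, 0.25)
front = [(x, (1.0 - np.sqrt(x)) ** 2) for x in np.linspace(0.0, 1.0, 101)]
print(knee_by_max_bulge(front))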

A second-order sequential optimality condition associated to the convergence of optimization algorithms

Sequential optimality conditions have recently played an important role in the analysis of the global convergence of optimization algorithms towards first-order stationary points and in justifying their stopping criteria. In this paper we introduce the first sequential optimality condition that takes into account second-order information. We also present a companion constraint qualification that is less stringent … Read more
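For background (this is the established first-order condition, not the new second-order condition introduced in the paper), the approximate-KKT (AKKT) condition for a problem $\min f(x)$ subject to $g_i(x) \le 0$, $i = 1, \dots, m$, states that a feasible point $\bar{x}$ is AKKT if there exist sequences $x^k \to \bar{x}$ and $\lambda^k \in \mathbb{R}^m_+$ such that \begin{eqnarray*} \nabla f(x^k) + \sum_{i=1}^m \lambda_i^k \nabla g_i(x^k) \to 0 \quad \textrm{and} \quad \min\{-g_i(x^k), \lambda_i^k\} \to 0 \textrm{ for all } i; \end{eqnarray*} every local minimizer satisfies AKKT without any constraint qualification, which is what makes such conditions useful for justifying stopping criteria.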

A proximal gradient method for ensemble density functional theory

Ensemble density functional theory is valuable for simulations of metallic systems due to the absence of a gap in the spectrum of the Hamiltonian matrices. Although the widely used self-consistent field iteration method can be extended to solve the minimization of the total energy functional subject to orthogonality constraints, there is no theoretical … Read more
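For orientation only (this is the generic scheme, not the specific matrix formulation with orthogonality constraints and ensemble weights treated in the paper), a proximal gradient step for a composite objective $E = f + g$ with smooth $f$ takes the form \begin{eqnarray*} X^{k+1} = \mathrm{prox}_{t_k g}\big(X^k - t_k \nabla f(X^k)\big), \qquad \mathrm{prox}_{t g}(Y) = \mathop{\arg\min}_{X} \; g(X) + \tfrac{1}{2t}\|X - Y\|^2, \end{eqnarray*} with step sizes $t_k > 0$.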

New results on subgradient methods for strongly convex optimization problems with a unified analysis

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion which generalizes the standard Euclidean strong convexity. We propose a unifying framework for subgradient methods that yields two kinds of methods, namely the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), and that subsumes several existing methods. The unifying framework provides … Read more
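As a minimal sketch of the classical Euclidean special case (the paper's framework uses a generalized, possibly non-Euclidean, notion of strong convexity and also covers conditional-gradient updates), the projected subgradient method with the standard step size $2/(\mu(k+1))$ and weighted averaging can be written as follows; the function names and the toy problem are illustrative.

import numpy as np

def projected_subgradient(subgrad, project, x0, mu, iters=1000):
    # Euclidean projected subgradient method for a mu-strongly convex objective,
    # using the step size 2/(mu*(k+1)) and a weighted average of the iterates.
    x = np.asarray(x0, dtype=float)
    x_avg = np.zeros_like(x)
    for k in range(1, iters + 1):
        g = subgrad(x)
        x = project(x - 2.0 / (mu * (k + 1)) * g)
        x_avg += 2.0 * k / (iters * (iters + 1)) * x  # weights sum to one
    return x_avg

# toy problem: minimize |x| + 0.5*x^2 over [-1, 1] (mu = 1, minimizer at 0)
sol = projected_subgradient(lambda x: np.sign(x) + x,
                            lambda x: np.clip(x, -1.0, 1.0),
                            x0=np.array([0.9]), mu=1.0)
print(sol)  # close to 0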

An extension of the projected gradient method to a Banach space setting with application in structural topology optimization

For the minimization of a nonlinear cost functional under convex constraints, the relaxed projected gradient process is a well-known method. The analysis is classically performed in a Hilbert space. We generalize this method to functionals which are differentiable in a Banach space. The search direction is calculated by a quadratic approximation of the cost … Read more
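For reference, in the classical Hilbert-space setting the relaxed projected gradient iteration for $\min_{x \in C} f(x)$ with a closed convex set $C$ reads \begin{eqnarray*} x_{k+1} = x_k + \alpha_k \big( P_C(x_k - s_k \nabla f(x_k)) - x_k \big), \qquad \alpha_k \in (0, 1], \; s_k > 0, \end{eqnarray*} where $P_C$ is the metric projection onto $C$; in a Banach space this projection loses the properties the classical analysis relies on, which motivates computing the search direction from a quadratic approximation of the cost functional instead.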

Partial Relaxation of Equality-constrained Programs

This paper presents a reformulation that is a natural “by-product” of the “variable endogenization” process for equality-constrained programs. The method results in a partial relaxation of the constraints, which in turn confers some computational advantages. A fully-annotated example illustrates the technique and presents some comparative numerical results. Citation: Siwale, I.: Partial Relaxation of Equality-constrained Programs. Technical … Read more

Optimality and complexity for constrained optimization problems with nonconvex regularization

In this paper, we consider a class of constrained optimization problems where the feasible set is a general closed convex set and the objective function has a nonsmooth, nonconvex regularizer. Such regularizers include the widely used SCAD, MCP, logistic, fraction, hard thresholding and non-Lipschitz $L_p$ penalties as special cases. Using the theory of the generalized directional … Read more
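As one concrete instance of such a regularizer, the following sketch evaluates the minimax concave penalty (MCP); the parameter values lam and gamma are illustrative only, and the other penalties listed above (SCAD, logistic, fraction, hard thresholding, $L_p$) have analogous closed forms.

import numpy as np

def mcp(t, lam=1.0, gamma=3.0):
    # Minimax concave penalty: grows like lam*|t| near the origin and
    # flattens to the constant gamma*lam^2/2 once |t| exceeds gamma*lam.
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a ** 2 / (2.0 * gamma),
                    0.5 * gamma * lam ** 2)

print(mcp(np.array([0.0, 0.5, 5.0])))  # [0.  0.4583...  1.5]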

Copositivity for second-order optimality conditions in general smooth optimization problems

Second-order local optimality conditions involving copositivity of the Hessian of the Lagrangian on the reduced linearization cone have the advantage that there is only a small gap between the sufficient condition (the Hessian is strictly copositive) and the necessary condition (the Hessian is copositive). In this respect, these conditions provide a proper generalization of convexity of the Lagrangian. We … Read more
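For reference, with $H$ the Hessian of the Lagrangian and $C$ the reduced linearization cone mentioned above, copositivity means \begin{eqnarray*} d^\top H d \ge 0 \quad \textrm{for all } d \in C, \qquad \textrm{and strict copositivity means} \quad d^\top H d > 0 \quad \textrm{for all } d \in C \setminus \{0\}, \end{eqnarray*} so the gap between the necessary and the sufficient condition is only the distinction between the non-strict and the strict inequality.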