A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization

We introduce a family of weighted conjugate-gradient-type methods for strictly convex quadratic functions, whose parameters are determined by a minimization model based on a convex combination of the objective function and its gradient norm. This family includes the classical linear conjugate gradient method and the recently published delayed weighted gradient method as extreme cases … Read more

Strengthened SDP Relaxation for an Extended Trust Region Subproblem with an Application to Optimal Power Flow

We study an extended trust region subproblem minimizing a nonconvex function over the hollow ball $r \le \|x\| \le R$ intersected with a full-dimensional second-order cone (SOC) constraint of the form $\|x - c\| \le b^T x - a$. In particular, we present a class of valid cuts that improve existing semidefinite programming (SDP) … Read more
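For concreteness, the feasible region of this extended trust region subproblem is easy to state as a membership test (a sketch using the symbols $r, R, b, a, c$ exactly as in the abstract; the relaxation itself is not reproduced here):

```python
import numpy as np

def in_feasible_set(x, r, R, b, a, c):
    """Membership test for the feasible region of the extended trust
    region subproblem: the hollow ball r <= ||x|| <= R intersected with
    the second-order cone constraint ||x - c|| <= b^T x - a.
    (Sketch only; the paper works with an SDP relaxation of this set.)"""
    nx = np.linalg.norm(x)
    return (r <= nx <= R) and (np.linalg.norm(x - c) <= b @ x - a)
```

Note that the hollow-ball lower bound $r \le \|x\|$ is the nonconvex part; the SOC constraint alone describes a convex set.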

Global optimality in minimum compliance topology optimization of frames and shells by moment-sum-of-squares hierarchy

The design of minimum-compliance bending-resistant structures with continuous cross-section parameters is a challenging task because of its inherent non-convexity. Our contribution develops a strategy that facilitates computing all guaranteed globally optimal solutions for frame and shell structures under multiple load cases and self-weight. To this end, we exploit the fact that the stiffness matrix is … Read more

Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank Constraints

We propose a framework for modeling and solving low-rank optimization problems to certifiable optimality. We introduce symmetric projection matrices that satisfy $Y^2 = Y$, the matrix analog of binary variables that satisfy $z^2 = z$, to model rank constraints. By leveraging regularization and strong duality, we prove that this modeling paradigm yields tractable convex optimization … Read more
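The analogy in this abstract is worth making concrete: just as a binary variable is characterized by $z^2 = z$, a symmetric matrix $Y$ with $Y^2 = Y$ is an orthogonal projection, and its trace equals its rank. A quick numerical check of these properties (an illustration of the modeling object, not of the paper's solution method):

```python
import numpy as np

def projection_matrix(U):
    """Orthogonal projection Y = U U^T onto the column space of U,
    where U is assumed to have orthonormal columns.  Y is symmetric,
    idempotent (Y^2 = Y), and satisfies trace(Y) = rank(Y) -- the
    matrix analog of a binary variable z with z^2 = z.
    (Illustrative sketch of the modeling object.)"""
    return U @ U.T
```

A rank constraint $\mathrm{rank}(X) \le k$ can then be expressed via such a $Y$ with $\mathrm{tr}(Y) \le k$ and $X = Y X$, mirroring big-M or indicator modeling with binaries.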

A structured modified Newton approach for solving systems of nonlinear equations arising in interior-point methods for quadratic programming

The focus in this work is interior-point methods for quadratic optimization problems with linear inequality constraints, where the arising systems of nonlinear equations are solved with Newton-like methods. In particular, the concern is the system of linear equations to be solved at each iteration. Newton systems give high-quality solutions but there is an … Read more
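For context, the per-iteration linear system referred to here is the Newton system for the perturbed KKT conditions. A minimal sketch for the generic primal-dual formulation of $\min \tfrac12 x^T H x + c^T x$ s.t. $Ax \ge b$, with slacks $s = Ax - b > 0$ and multipliers $\lambda > 0$ (this is the standard unreduced system, not the paper's structured modification):

```python
import numpy as np

def ipm_newton_step(H, c, A, b, x, s, lam, mu):
    """One Newton step on the perturbed KKT system
        H x + c - A^T lam = 0,   A x - s - b = 0,   s_i lam_i = mu,
    solved for (dx, ds, dlam) from the full unreduced linear system.
    (Generic primal-dual sketch; interior-point codes usually solve a
    reduced/structured form of this system instead.)"""
    m, n = A.shape
    rd = H @ x + c - A.T @ lam           # dual residual
    rp = A @ x - s - b                   # primal residual
    rc = s * lam - mu                    # complementarity residual
    K = np.zeros((n + 2 * m, n + 2 * m))
    K[:n, :n] = H                        # H dx - A^T dlam = -rd
    K[:n, n + m:] = -A.T
    K[n:n + m, :n] = A                   # A dx - ds = -rp
    K[n:n + m, n:n + m] = -np.eye(m)
    K[n + m:, n:n + m] = np.diag(lam)    # Lam ds + S dlam = -rc
    K[n + m:, n + m:] = np.diag(s)
    d = np.linalg.solve(K, -np.concatenate([rd, rp, rc]))
    return d[:n], d[n:n + m], d[n + m:]
```

Because the first two KKT blocks are linear, a full Newton step zeroes the primal and dual residuals exactly; only the complementarity equation remains nonlinear.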

Spectral Residual Method for Nonlinear Equations on Riemannian Manifolds

In this paper, the spectral algorithm for nonlinear equations (SANE) is adapted to the problem of finding a zero of a given tangent vector field on a Riemannian manifold. The generalized version of SANE uses, in a systematic way, the tangent vector field as a search direction and a continuous real-valued function that adapts this … Read more
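The Euclidean method being generalized can be sketched in a few lines: SANE uses the residual $F(x)$ itself as the search direction, scaled by a Barzilai-Borwein spectral step. The sketch below omits the globalizing nonmonotone line search, and the Riemannian version of the paper would additionally replace the straight-line update with a retraction and transport of tangent vectors:

```python
import numpy as np

def sane(F, x0, tol=1e-8, max_iter=500):
    """Bare-bones Euclidean SANE iteration: step along -F(x) with the
    Barzilai-Borwein spectral step sigma = s's / s'y, where
    s = x_{k+1} - x_k and y = F(x_{k+1}) - F(x_k).
    (Sketch without the nonmonotone line-search globalization.)"""
    x = np.asarray(x0, dtype=float).copy()
    Fx = F(x)
    sigma = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        x_new = x - sigma * Fx
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        sty = s @ y
        sigma = (s @ s) / sty if abs(sty) > 1e-16 else 1.0
        x, Fx = x_new, F_new
    return x
```

When $F$ is the gradient of a strongly convex quadratic, this reduces to the classical Barzilai-Borwein gradient method.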

Using first-order information in Direct Multisearch for multiobjective optimization

Derivatives are an important tool for single-objective optimization. In fact, it is commonly accepted that derivative-based methods perform better than derivative-free optimization approaches. In this work, we will show that the same does not apply to multiobjective derivative-based optimization, when the goal is to compute an approximation to the complete Pareto front of … Read more

Regret Minimization in Stochastic Non-Convex Learning via a Proximal-Gradient Approach

Motivated by applications in machine learning and operations research, we study regret minimization with stochastic first-order oracle feedback in online constrained, and possibly non-smooth, non-convex problems. In this setting, the minimization of external regret is beyond reach, so we focus on a local regret measure defined via a proximal-gradient residual mapping. To achieve no (local) … Read more
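The residual mapping underlying this local regret measure is standard and easy to state: $G_\eta(x) = \big(x - P(x - \eta\,\nabla f(x))\big)/\eta$, where $P$ is the prox/projection map of the feasible set, and $\|G_\eta(x)\| = 0$ exactly when $x$ is first-order stationary. A sketch with a projection standing in for a general prox operator:

```python
import numpy as np

def prox_gradient_residual(x, grad, project, eta):
    """Proximal-gradient residual mapping
        G_eta(x) = (x - P(x - eta * grad)) / eta,
    where P is the projection onto the feasible set.  It vanishes iff
    x is a first-order stationary point of the constrained problem.
    (Standard definition; projection used in place of a general prox.)"""
    return (x - project(x - eta * grad)) / eta
```

On an unconstrained region the residual reduces to the gradient itself, while at a boundary point where the gradient pushes outward it is zero, correctly flagging stationarity.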

Economic inexact restoration for derivative-free expensive function minimization and applications

The Inexact Restoration approach has proved to be an adequate tool for handling the problem of minimizing an expensive function within an arbitrary feasible set by using different degrees of precision in the objective function. The Inexact Restoration framework allows one to obtain suitable convergence and complexity results for an approach that rationally combines low- … Read more

Constrained global optimization of functions with low effective dimensionality using multiple random embeddings

We consider the bound-constrained global optimization of functions with low effective dimensionality, which are constant along an (unknown) linear subspace and only vary over the effective (complement) subspace. We aim to implicitly explore the intrinsic low dimensionality of the constrained landscape using feasible random embeddings, in order to understand and improve the scalability of algorithms … Read more
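The basic reduction behind random embeddings can be sketched in a few lines: instead of minimizing $f$ over $\mathbb{R}^D$, minimize $g(y) = f(Ay)$ over a random $d$-dimensional subspace, with $A \in \mathbb{R}^{D \times d}$ Gaussian. Repeating with multiple independent embeddings, as the abstract suggests, raises the probability that some subspace aligns well with the effective one (a generic REMBO-style sketch, not the paper's exact algorithm or its feasibility handling):

```python
import numpy as np

def random_embedding_objective(f, D, d, rng):
    """Reduce a D-dimensional objective f with low effective
    dimensionality to a d-dimensional one: g(y) = f(A y), with A a
    random Gaussian D x d embedding matrix.  Returns g and A.
    (Generic random-embedding sketch; constraints are ignored here.)"""
    A = rng.standard_normal((D, d))
    return (lambda y: f(A @ y)), A
```

If $f$ varies only along a $d_e$-dimensional effective subspace with $d \ge d_e$, a random $A$ almost surely yields a $g$ whose minimum value matches that of $f$.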