Fast Presolving Framework For Sparsity Constrained Convex Quadratic Programming: Screening-Based Cut Generation and Selection

Screening is widely utilized in Mixed-Integer Programming (MIP) presolving. It aims to certify a priori whether one or more specific binary variables can be fixed to their optimal values based on solutions of convex relaxations. This paper studies the challenge of solving Sparsity-constrained (strongly) Convex Quadratic Programming (SCQP) and proposes the Screening-based Cut Presolving Framework (SCPF). …
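The fixing logic behind screening can be illustrated with classical reduced-cost fixing, a standard presolving test in the same spirit; the sketch below is not the paper's SCPF framework, and the function name, inputs, and thresholds are illustrative assumptions.

```python
def screen_binaries(relax_vals, reduced_costs, z_relax, z_incumbent):
    """Classical reduced-cost fixing (illustrative, not SCPF itself).

    For a minimization MIP, flipping binary j away from its integral
    relaxation value worsens the relaxation bound by at least
    |reduced_costs[j]|; if that exceeds the incumbent gap, x_j can be
    fixed a priori to its relaxation value.
    """
    gap = z_incumbent - z_relax
    fixed = {}
    for j, (v, r) in enumerate(zip(relax_vals, reduced_costs)):
        if v in (0.0, 1.0) and abs(r) > gap:  # integral and provably optimal
            fixed[j] = int(v)
    return fixed

# toy data: relaxation values, reduced costs, bound 10, incumbent 12
fixed = screen_binaries([0.0, 1.0, 0.4], [3.0, -2.5, 0.1],
                        z_relax=10.0, z_incumbent=12.0)
# variables 0 and 1 are fixed; variable 2 is fractional and left free
```

Here the fractional variable is skipped because reduced-cost arguments only certify integral relaxation values.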

Curvature-oriented variance reduction methods for nonconvex stochastic optimization

When pursuing an approximate second-order stationary point in nonconvex constrained stochastic optimization, is it possible to design a stochastic second-order method that achieves the same sample-complexity order as in the unconstrained setting? To address this question, we first introduce Carme, a curvature-oriented variance reduction method designed for unconstrained nonconvex stochastic optimization. …

Voronoi Conditional Gradient Method for Constrained Nonconvex Optimization

The Conditional Gradient method offers a computationally efficient, projection-free framework for constrained problems; however, in nonconvex settings it may converge to stationary points of low quality. We propose the Voronoi Conditional Gradient (VCG) method, a geometric heuristic that systematically explores the feasible region by constructing adaptive Voronoi partitions from previously discovered stationary points. VCG incrementally …
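For context, the classical conditional gradient (Frank–Wolfe) iteration that VCG builds on can be sketched as follows; the probability-simplex feasible set, the quadratic objective, and the open-loop stepsize are illustrative assumptions, not details from the paper.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=500):
    """Classical Frank-Wolfe on the probability simplex (illustrative).

    The linear minimization oracle over the simplex is trivial: it
    returns the vertex e_i with the most negative gradient coordinate.
    """
    x = x0.astype(float)
    for t in range(steps):
        g = grad(x)
        i = int(np.argmin(g))            # LMO: argmin_{s in simplex} <g, s>
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (t + 2.0)          # standard open-loop stepsize
        x = (1 - gamma) * x + gamma * s  # convex combination keeps x feasible
    return x

# minimize f(x) = ||x - c||^2 over the simplex; c lies inside the simplex
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe_simplex(lambda x: 2 * (x - c), np.array([1.0, 0.0, 0.0]))
```

Projection-freeness is visible here: feasibility is maintained purely by convex combinations with vertices returned by the linear oracle.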

Sensitivity-informed identification of temperature-dependent piezoelectric material parameters

An accurate characterization of temperature-dependent material parameters of piezoceramics is crucial for the design and simulation of reliable sensors and actuators. This characterization is typically formulated as an ill-posed inverse problem, which is challenging to solve not only because of its ill-posedness, but also because of parameter sensitivities, which vary by several orders of magnitude …

Riemannian Dueling Optimization

Dueling optimization considers optimizing an objective with access only to a comparison oracle of the objective function. It finds important applications in emerging fields such as recommendation systems and robotics. Existing works on dueling optimization have mainly focused on unconstrained problems in Euclidean space. In this work, we study dueling optimization over Riemannian manifolds, which …
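A minimal Euclidean sketch of optimization from one-bit comparison feedback (the setting the paper extends to manifolds) might look like the following; the random-direction probing scheme and all parameters are assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def dueling_descent(compare, x0, steps=800, delta=1e-3, eta=0.05):
    """Comparison-based descent (illustrative): only one bit per query.

    Probe a random unit direction u and move toward whichever of
    x + delta*u, x - delta*u the comparison oracle prefers.
    """
    x = x0.astype(float)
    for _ in range(steps):
        u = rng.standard_normal(x.size)
        u /= np.linalg.norm(u)
        # oracle returns True iff f(x + delta*u) < f(x - delta*u)
        if compare(x + delta * u, x - delta * u):
            x = x + eta * u
        else:
            x = x - eta * u
    return x

# minimize f(x) = ||x - c||^2 using comparisons only; f is never exposed
c = np.array([1.0, -2.0])
compare = lambda a, b: np.sum((a - c) ** 2) < np.sum((b - c) ** 2)
x = dueling_descent(compare, np.zeros(2))
```

With a fixed stepsize the iterate only oscillates in a small ball around the minimizer; a decaying stepsize would be needed for exact convergence.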

An objective-function-free algorithm for general smooth constrained optimization

A new algorithm for smooth constrained optimization is proposed that never computes the value of the problem’s objective function and that handles both equality and inequality constraints. The algorithm uses an adaptive switching strategy between a normal step aimed at reducing constraint infeasibility and a tangential step improving dual optimality, the latter being inspired by …

Learning to Choose Branching Rules for Nonconvex MINLPs

Outer-approximation-based branch-and-bound is a common algorithmic framework for solving MINLPs (mixed-integer nonlinear programs) to global optimality, with branching variable selection critically influencing overall performance. In modern global MINLP solvers, it is unclear whether branching on fractional integer variables should be prioritized over spatial branching on variables, potentially continuous, that show constraint violations, with different solvers …

Objective-Function Free Multi-Objective Optimization: Rate of Convergence and Performance of an Adagrad-like algorithm

We propose an Adagrad-like algorithm for multi-objective unconstrained optimization that relies on the computation of a common descent direction only. Unlike classical local algorithms for multi-objective optimization, our approach does not rely on the dominance property to accept new iterates, which allows for a flexible and function-free optimization framework. New points are obtained using an …
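For two objectives, a common descent direction has a well-known closed form: the negative of the min-norm point in the convex hull of the two gradients (as in MGDA). The sketch below pairs that construction with an Adagrad-style diagonal scaling and evaluates no function values anywhere; it is only an illustration under these assumptions, not the paper's algorithm.

```python
import numpy as np

def common_descent_direction(g1, g2):
    """Min-norm point in conv{g1, g2}; its negative is a common descent
    direction for both objectives (two-objective MGDA closed form)."""
    diff = g1 - g2
    denom = float(diff @ diff)
    if denom == 0.0:
        return g1
    t = float(np.clip(-(g2 @ diff) / denom, 0.0, 1.0))
    return g2 + t * diff

def adagrad_multiobj(grads, x0, lr=0.5, steps=2000, eps=1e-8):
    """Adagrad-like loop driven only by the common descent direction;
    note that no objective values are ever computed."""
    x = x0.astype(float)
    G = np.zeros_like(x)
    for _ in range(steps):
        d = common_descent_direction(*[g(x) for g in grads])
        G += d * d                         # accumulate squared coordinates
        x -= lr * d / (np.sqrt(G) + eps)   # diagonally scaled step
    return x

# two convex quadratics with minimizers a and b; the Pareto set is [a, b]
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x = adagrad_multiobj([lambda x: 2 * (x - a), lambda x: 2 * (x - b)],
                     np.zeros(2))
```

The iterate stops once the zero vector lies in the convex hull of the two gradients, i.e. at a Pareto-critical point on the segment between the two minimizers.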

An adaptive line-search-free multiobjective gradient method and its iteration-complexity analysis

This work introduces an Adaptive Line-Search-Free Multiobjective Gradient (AMG) method for solving smooth multiobjective optimization problems. The proposed approach automatically adjusts stepsizes based on steepest descent directions, promoting robustness with respect to stepsize choice while maintaining low computational cost. The method is specifically tailored to the multiobjective setting and does not rely on function evaluations, …

On Approximate Computation of Critical Points

We show that computing even very coarse approximations of critical points is intractable for simple classes of nonconvex functions. More concretely, we prove that if there exists a polynomial-time algorithm that takes as input a polynomial in \(n\) variables of constant degree (as low as three) and outputs a point whose gradient has Euclidean norm …