A Penalty-free Infeasible Approach for a Class of Nonsmooth Optimization Problems over the Stiefel Manifold

Transforming optimization problems with orthogonality constraints into an exact penalty function model with convex compact constraints yields efficient infeasible approaches. For smooth and ℓ2,1-norm regularized cases, these infeasible approaches adopt simple, orthonormalization-free updating schemes and show high efficiency in some numerical experiments. However, to avoid orthonormalization while enforcing the feasibility of the final …

Controllable Transmission Networks Under Demand Uncertainty with Modular FACTS

Transmission system operators (TSOs) are responsible for providing secure and efficient access to the transmission system for all stakeholders. This task is becoming increasingly challenging due to demand growth, demand uncertainty, rapid changes in the generation mix, and market policies. Traditionally, TSOs try to maximize the technical performance of the transmission network via …

A Framework of Inertial Alternating Direction Method of Multipliers for Non-Convex Non-Smooth Optimization

In this paper, we propose an algorithmic framework, dubbed inertial alternating direction method of multipliers (iADMM), for solving a class of nonconvex nonsmooth multiblock composite optimization problems with linear constraints. Our framework employs the general majorization-minimization (MM) principle to update each block of variables so as to not only unify the convergence analysis of previous …

Sparse Approximations with Interior Point Methods

Large-scale optimization problems that seek sparse solutions have become ubiquitous. They are routinely solved with various specialized first-order methods. Although such methods are often fast, they usually struggle with not-so-well-conditioned problems. In this paper, specialized variants of an interior point-proximal method of multipliers are proposed and analyzed for problems of this class. Computational experience …

A General Framework for Optimal Control of Fractional Nonlinear Delay Systems by Wavelets

An iterative procedure for finding the optimal solutions of general fractional nonlinear delay systems with quadratic performance indices is introduced. The derivatives in the state equations are understood in the Caputo sense. By presenting and applying a general framework, we use the Chebyshev wavelet method developed for fractional linear optimal control to convert fractional nonlinear optimal control …

On the Numerical Performance of Derivative-Free Optimization Methods Based on Finite-Difference Approximations

The goal of this paper is to investigate an approach to derivative-free optimization that has not received sufficient attention in the literature, yet is one of the simplest to implement and parallelize. It consists of computing gradients of a smoothed approximation of the objective function (and constraints) and employing them within established codes. These …
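The basic building block of such an approach, an estimated gradient that can then be handed to any established gradient-based code, can be sketched as follows. This is a minimal illustration only; the function name `fd_gradient` and the choice of central differences are ours, not necessarily the paper's.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Central finite-difference estimate of the gradient of f at x.

    One coordinate is perturbed at a time, so the 2n function
    evaluations are trivially parallelizable.
    """
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Example: the gradient of f(x) = ||x||^2 at (1, 2) is (2, 4).
g = fd_gradient(lambda x: float(x @ x), [1.0, 2.0])
```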

Beyond local optimality conditions: the case of maximizing a convex function

In this paper, we design an algorithm for maximizing a convex function over a convex feasible set. The algorithm consists of two phases: in phase 1 a feasible solution is obtained that is used as an initial starting point in phase 2. In the latter, a biconvex problem equivalent to the original problem is solved …

An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems

In this paper, we propose a new method for a class of difference-of-convex (DC) optimization problems, whose objective is the sum of a smooth function and a possibly non-prox-friendly DC function. The method sequentially solves subproblems constructed from a quadratic approximation of the smooth function and a linear majorization of the concave part of the …
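The subproblem structure just described, a quadratic model of the smooth part plus a linear majorization of the concave part, can be sketched on a prox-friendly toy instance. All names here are hypothetical, and the paper's focus is precisely the harder case where the DC function is not prox-friendly:

```python
def dc_step(x, grad_smooth, subgrad_q, prox_p, t):
    """One exact subproblem solve of a successive quadratic
    approximation scheme for min f(x) + p(x) - q(x), f smooth,
    p and q convex.  With step size t, the subproblem

        min_y  <grad f(x) - g_q, y - x> + |y - x|^2 / (2t) + p(y),

    where g_q is a subgradient of q at x, reduces to a proximal
    step on p when p is prox-friendly (a simplifying assumption).
    """
    v = x - t * (grad_smooth(x) - subgrad_q(x))
    return prox_p(v, t)

# Toy 1-D instance: smooth part x^2/2, p(y) = |y| (soft-threshold prox), q = 0.
soft = lambda v, t: (abs(v) - t) * (1.0 if v > 0 else -1.0) if abs(v) > t else 0.0
x_new = dc_step(1.0, grad_smooth=lambda x: x, subgrad_q=lambda x: 0.0,
                prox_p=soft, t=0.5)
```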

Direct-Search for a Class of Stochastic Min-Max Problems

Recent applications in machine learning have renewed the community's interest in min-max optimization problems. While gradient-based methods are widely used to solve such problems, there are many scenarios where they are ill-suited, or even inapplicable because the gradient is not accessible. We investigate the use of direct-search methods …

Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization

We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case complexity analysis for our method; in particular, we prove high-probability bounds on the number of iterations before a given level of optimality is achieved. This framework is specialized to nonlinear least-squares problems, with a …
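The random-subspace idea can be illustrated with a single Gauss-Newton-type step restricted to the span of a random matrix. Note the simplification: this sketch assumes an explicit Jacobian, whereas the derivative-free method in question builds models from function values only.

```python
import numpy as np

def subspace_gn_step(r, J, x, p, rng):
    """One illustrative random-subspace Gauss-Newton step for
    min ||r(x)||^2: draw a random n x p matrix P, form the reduced
    Jacobian J(x) P, and solve the p-dimensional least-squares model
    so the step is confined to span(P)."""
    n = x.size
    P = rng.standard_normal((n, p)) / np.sqrt(p)
    JP = J(x) @ P                              # reduced (n x p) Jacobian
    s_hat, *_ = np.linalg.lstsq(JP, -r(x), rcond=None)
    return x + P @ s_hat

# Linear residual r(x) = A x - b: with a full-dimensional subspace (p = n),
# one step from the origin recovers the exact solution A^{-1} b = (1, 1).
A = np.array([[2.0, 0.0], [0.0, 3.0]])
b = np.array([2.0, 3.0])
x1 = subspace_gn_step(lambda x: A @ x - b, lambda x: A,
                      np.zeros(2), p=2, rng=np.random.default_rng(0))
```

Choosing p much smaller than n is what makes each iteration cheap; the cited analysis is what controls how many such cheap iterations are needed.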