A Two-level ADMM Algorithm for AC OPF with Convergence Guarantees

This paper proposes a two-level distributed algorithmic framework for solving the AC optimal power flow (OPF) problem with convergence guarantees. The presence of highly nonconvex constraints in OPF poses significant challenges to distributed algorithms based on the alternating direction method of multipliers (ADMM). In particular, convergence is not provably guaranteed for nonconvex network optimization problems …
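
As a reference point for the distributed structure, the following is a minimal sketch of the standard single-level, scaled-form ADMM iteration that such frameworks build on; the splitting, prox oracles, and parameter values are illustrative assumptions, not the paper's two-level scheme.

```python
# Minimal sketch of scaled-form consensus ADMM for
#   minimize f(x) + g(z)  subject to  x - z = 0,
# the single-level building block that distributed OPF schemes refine.
import numpy as np

def admm(prox_f, prox_g, n, rho=1.0, iters=100):
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u is the scaled dual variable
    for _ in range(iters):
        x = prox_f(z - u, rho)    # x-update: argmin_x f(x) + (rho/2) ||x - z + u||^2
        z = prox_g(x + u, rho)    # z-update: argmin_z g(z) + (rho/2) ||x + u - z||^2
        u = u + x - z             # dual (multiplier) update in scaled form
    return x, z

# Toy instance: f(x) = 0.5 ||x - a||^2, g = indicator of the box [0, 1]^n
a = np.array([1.5, -0.3, 0.4])
prox_f = lambda v, rho: (a + rho * v) / (1.0 + rho)
prox_g = lambda v, rho: np.clip(v, 0.0, 1.0)
x, z = admm(prox_f, prox_g, n=3)   # x and z converge to a common feasible point
```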

Iteration-complexity of a proximal augmented Lagrangian method for solving nonconvex composite optimization problems with nonlinear convex constraints

This paper proposes and analyzes a proximal augmented Lagrangian (NL-IAPIAL) method for solving smooth nonconvex composite optimization problems with nonlinear K-convex constraints, i.e., the constraints are convex with respect to the order given by a closed convex cone K. Each NL-IAPIAL iteration consists of inexactly solving a proximal augmented Lagrangian subproblem by an accelerated composite …
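
For orientation, here is a minimal sketch of a proximal augmented Lagrangian outer loop in the simplest case K = R^m_+ (componentwise inequality constraints); the generic BFGS inner call, penalty parameter, and proximal stepsize are placeholder assumptions standing in for the ACG inner method and the parameter choices analyzed in the paper.

```python
# Minimal sketch of a proximal augmented Lagrangian outer loop for
#   minimize f(x)  subject to  g(x) <= 0 componentwise (i.e. K = R^m_+).
# A generic BFGS call stands in for the accelerated composite gradient (ACG)
# inner method; rho, eta, and the iteration counts are illustrative.
import numpy as np
from scipy.optimize import minimize

def proximal_aug_lagrangian(f, g, x0, rho=10.0, eta=1.0, outer_iters=20):
    x, lam = x0.astype(float), np.zeros_like(g(x0), dtype=float)
    for _ in range(outer_iters):
        x_prev = x.copy()
        def subproblem(y):
            viol = np.maximum(0.0, g(y) + lam / rho)            # shifted constraint violation
            return (f(y) + 0.5 * rho * viol @ viol
                    + 0.5 / eta * (y - x_prev) @ (y - x_prev))  # proximal regularization
        x = minimize(subproblem, x_prev, method="BFGS").x       # inexact inner solve
        lam = np.maximum(0.0, lam + rho * g(x))                 # multiplier update
    return x, lam

# Toy instance: minimize (x1 - 2)^2 + (x2 - 2)^2  subject to  x1 + x2 - 1 <= 0
f = lambda y: (y[0] - 2.0) ** 2 + (y[1] - 2.0) ** 2
g = lambda y: np.array([y[0] + y[1] - 1.0])
x_opt, lam_opt = proximal_aug_lagrangian(f, g, np.zeros(2))     # x_opt is close to [0.5, 0.5]
```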

On the best achievable quality of limit points of augmented Lagrangian schemes

The optimization literature contains a vast number of papers dealing with improvements to the global convergence of augmented Lagrangian schemes. Usually, the results are based on weak constraint qualifications or, more recently, on sequential optimality conditions obtained via penalization techniques. In this paper we propose a somewhat different approach, in the sense that the algorithm itself is …

Exact Penalty Function for L21 Norm Minimization over the Stiefel Manifold

L21-norm minimization with orthogonality constraints, whose feasible region is the Stiefel manifold, has wide applications in statistics and data science. State-of-the-art approaches apply proximal gradient techniques on either the Stiefel manifold or its tangent spaces. The resulting subproblem does not have a closed-form solution and hence requires an iterative procedure to solve, which is …
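
Away from the manifold constraint, the nonsmooth L21 term itself has a simple proximal operator (row-wise soft thresholding); the short sketch below, with an illustrative stepsize, shows that building block only, not the manifold-constrained subproblem discussed in the abstract.

```python
# Minimal sketch of the proximal operator of the L21 norm (sum of row-wise
# Euclidean norms): row-wise soft thresholding with stepsize t. This is the
# nonsmooth building block only, not the Stiefel-constrained subproblem.
import numpy as np

def prox_l21(X, t):
    """prox of t * sum_i ||X[i, :]||_2 : shrink each row, zeroing small ones."""
    row_norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(row_norms, 1e-12))
    return scale * X

X = np.array([[3.0, 4.0], [0.1, 0.2], [-1.0, 0.0]])
print(prox_l21(X, t=0.5))   # rows with norm below t are set to zero, others are shrunk
```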

A Nonmonotone Matrix-Free Algorithm for Nonlinear Equality-Constrained Least-Squares Problems

Least-squares problems form one of the most prominent classes of optimization problems, with numerous applications in scientific computing and data fitting. When such formulations aim at modeling complex systems, the optimization process must account for nonlinear dynamics by incorporating constraints. In addition, these systems often involve a large number of variables, which increases the difficulty …
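
To illustrate what "matrix-free" means in this setting, the sketch below performs a Gauss-Newton step for an unconstrained nonlinear least-squares toy problem using only Jacobian-vector and transpose products; the equality-constraint handling and nonmonotone globalization of the paper are not reproduced, and all names are illustrative.

```python
# Minimal sketch of one matrix-free Gauss-Newton step for min 0.5 ||F(x)||^2:
# the Jacobian is never formed; only Jacobian-vector and transpose products
# are supplied, in the spirit of matrix-free least-squares solvers.
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsmr

def gauss_newton_step(F, Jv, JTv, x):
    r = F(x)
    J = LinearOperator((r.size, x.size),
                       matvec=lambda v: Jv(x, v),
                       rmatvec=lambda w: JTv(x, w))
    step = lsmr(J, -r)[0]          # least-squares solve of J s ~ -F(x)
    return x + step

# Toy residual F(x) = [x0^2 - 1, x0*x1 - 2] with hand-coded Jacobian products
F = lambda x: np.array([x[0] ** 2 - 1.0, x[0] * x[1] - 2.0])
Jv = lambda x, v: np.array([2.0 * x[0] * v[0], x[1] * v[0] + x[0] * v[1]])
JTv = lambda x, w: np.array([2.0 * x[0] * w[0] + x[1] * w[1], x[0] * w[1]])
x = np.array([2.0, 2.0])
for _ in range(10):
    x = gauss_newton_step(F, Jv, JTv, x)   # converges toward the solution [1, 2]
```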

Riemannian Optimization on the Symplectic Stiefel Manifold

The symplectic Stiefel manifold, denoted by $\mathrm{Sp}(2p,2n)$, is the set of linear symplectic maps between the standard symplectic spaces $\mathbb{R}^{2p}$ and $\mathbb{R}^{2n}$. When $p=n$, it reduces to the well-known set of $2n\times 2n$ symplectic matrices. Optimization problems on $\mathrm{Sp}(2p,2n)$ find applications in various areas, such as optics, quantum physics, numerical linear algebra and model order …
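
A minimal sketch of the defining relation behind $\mathrm{Sp}(2p,2n)$ (a matrix $X\in\mathbb{R}^{2n\times 2p}$ belongs to it iff $X^\top J_{2n} X = J_{2p}$, with $J$ the standard symplectic form) is given below; the helper names and tolerance are illustrative.

```python
# Minimal sketch of the defining relation of Sp(2p, 2n): X in R^{2n x 2p} is a
# symplectic Stiefel point iff X^T J_{2n} X = J_{2p}, where J is the standard
# symplectic (Poisson) matrix. Names and tolerance are illustrative.
import numpy as np

def symplectic_form(k):
    """J_{2k} = [[0, I_k], [-I_k, 0]]."""
    I, Z = np.eye(k), np.zeros((k, k))
    return np.block([[Z, I], [-I, Z]])

def is_symplectic_stiefel(X, p, n, tol=1e-10):
    return np.linalg.norm(X.T @ symplectic_form(n) @ X - symplectic_form(p)) <= tol

# For p = n the condition recovers ordinary 2n x 2n symplectic matrices;
# J_{2n} itself satisfies it:
n = 2
print(is_symplectic_stiefel(symplectic_form(n), p=n, n=n))   # True
```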

Optimality conditions in discrete-continuous nonlinear optimization

This paper presents necessary and sufficient optimality conditions for discrete-continuous nonlinear optimization problems, including mixed-integer nonlinear problems. The theory does not rely on an extension of the Lagrange theory of continuous optimization; instead, it works with certain max functionals for the separation of two sets, one of which is nonconvex. These functionals have the advantage …

High-order Evaluation Complexity of a Stochastic Adaptive Regularization Algorithm for Nonconvex Optimization Using Inexact Function Evaluations and Randomly Perturbed Derivatives

A stochastic adaptive regularization algorithm allowing random noise in derivatives and inexact function values is proposed for computing strong approximate minimizers of any order for inexpensively constrained smooth optimization problems. For an objective function with Lipschitz continuous p-th derivative in a convex neighbourhood of the feasible set and given an arbitrary optimality order q, it …
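
For context, the sketch below shows one deterministic adaptive-regularization step with p = 2 (a cubic-regularized Newton step) together with the usual success/failure update of the regularization weight; the stochastic noise model, inexact evaluations, and order-q optimality measures of the paper are not reproduced, and the parameters are illustrative.

```python
# Minimal sketch of one deterministic adaptive-regularization step with p = 2
# (cubic-regularized Newton): build the regularized model, take the step if it
# yields sufficient decrease, otherwise increase the regularization weight.
# The stochastic/inexact ingredients of the paper are not reproduced.
import numpy as np
from scipy.optimize import minimize

def ar2_step(f, grad, hess, x, sigma, eta=0.1, gamma=2.0):
    g, H = grad(x), hess(x)
    model = lambda s: f(x) + g @ s + 0.5 * s @ H @ s + (sigma / 3.0) * np.linalg.norm(s) ** 3
    s = minimize(model, np.zeros_like(x), method="BFGS").x   # approximate model minimizer
    predicted, actual = f(x) - model(s), f(x) - f(x + s)
    if predicted > 0.0 and actual >= eta * predicted:
        return x + s, max(sigma / gamma, 1e-8)               # successful step: relax sigma
    return x, gamma * sigma                                  # unsuccessful step: tighten sigma

# Toy nonconvex objective f(x) = x0^4 + x0*x1 + (1 + x1)^2
f = lambda x: x[0] ** 4 + x[0] * x[1] + (1.0 + x[1]) ** 2
grad = lambda x: np.array([4.0 * x[0] ** 3 + x[1], x[0] + 2.0 * (1.0 + x[1])])
hess = lambda x: np.array([[12.0 * x[0] ** 2, 1.0], [1.0, 2.0]])
x, sigma = np.array([1.0, 1.0]), 1.0
for _ in range(30):
    x, sigma = ar2_step(f, grad, hess, x, sigma)             # drives the gradient norm toward 0
```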

A Primal–Dual Penalty Method via Rounded Weighted-$\ell_1$ Lagrangian Duality

We propose a new duality scheme based on a sequence of smooth minorants of the weighted-$\ell_1$ penalty function, interpreted as a parametrized sequence of augmented Lagrangians, to solve nonconvex and nonsmooth constrained optimization problems. For the induced sequence of dual problems, we establish strong asymptotic duality properties. Namely, we show that (i) the sequence of …
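
As a rough illustration of the ingredients (not the paper's specific rounded minorants or duality scheme), the sketch below assembles a weighted-$\ell_1$ penalty for inequality constraints and one generic smooth minorant of $|t|$, a pseudo-Huber term; all names and parameters are assumptions.

```python
# Minimal sketch of a weighted-l1 penalty for  min f(x) s.t. g(x) <= 0  and of a
# generic smooth minorant of |t| (a pseudo-Huber term, valid for 0 < delta <= 1);
# the paper's rounded minorants and dual scheme are not reproduced here.
import numpy as np

def weighted_l1_penalty(f, g, w):
    """Nonsmooth weighted-l1 penalty: f(x) + sum_i w_i * max(0, g_i(x))."""
    return lambda x: f(x) + np.dot(w, np.maximum(0.0, g(x)))

def smooth_minorant_abs(t, delta=0.1):
    """Pseudo-Huber term: smooth, equals 0 at t = 0, and stays below |t|."""
    return delta * (np.sqrt(delta ** 2 + t ** 2) - delta)

# Toy usage: a large weight makes infeasible points expensive
f = lambda x: np.sum(x ** 2)
g = lambda x: np.array([1.0 - x[0]])                       # feasible set: x0 >= 1
P = weighted_l1_penalty(f, g, w=np.array([10.0]))
print(P(np.array([0.0, 0.0])), P(np.array([1.0, 0.0])))    # 10.0 vs 1.0
```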

Iteration-complexity of an inexact proximal accelerated augmented Lagrangian method for solving linearly constrained smooth nonconvex composite optimization problems

This paper proposes and establishes the iteration-complexity of an inexact proximal accelerated augmented Lagrangian (IPAAL) method for solving linearly constrained smooth nonconvex composite optimization problems. Each IPAAL iteration consists of inexactly solving a proximal augmented Lagrangian subproblem by an accelerated composite gradient (ACG) method followed by a suitable Lagrange multiplier update. It is shown that …
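
For orientation, a minimal sketch of the outer structure of an inexact proximal augmented Lagrangian loop for linear constraints $Ax=b$ follows; plain gradient steps stand in for the accelerated composite gradient inner method, and all parameters are illustrative assumptions rather than the choices analyzed in the paper.

```python
# Minimal sketch of an inexact proximal augmented Lagrangian outer loop for
#   minimize f(x)  subject to  A x = b.
# Plain gradient steps stand in for the accelerated composite gradient (ACG)
# inner method; rho, eta, the stepsize, and iteration counts are illustrative.
import numpy as np

def ipal(grad_f, A, b, x0, rho=10.0, eta=1.0, step=1e-2, inner=200, outer=30):
    x, lam = x0.astype(float), np.zeros(A.shape[0])
    for _ in range(outer):
        x_prev = x.copy()
        for _ in range(inner):                                   # inexact inner solve
            residual = A @ x - b
            grad = grad_f(x) + A.T @ (lam + rho * residual) + (x - x_prev) / eta
            x = x - step * grad
        lam = lam + rho * (A @ x - b)                            # Lagrange multiplier update
    return x, lam

# Toy problem: minimize 0.5 ||x - c||^2 with c = [2, 0], subject to x0 + x1 = 1
grad_f = lambda x: x - np.array([2.0, 0.0])
A, b = np.array([[1.0, 1.0]]), np.array([1.0])
x_opt, lam_opt = ipal(grad_f, A, b, np.zeros(2))                 # x_opt is close to [1.5, -0.5]
```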