Optimality conditions for nonlinear programming problems on Riemannian manifolds

In recent years, many traditional optimization methods have been successfully generalized to minimize objective functions on manifolds. In this paper, we first extend the classical constrained optimization problem to a nonlinear programming problem posed on a general Riemannian manifold $\mathcal{M}$, and discuss the first-order and second-order optimality conditions. By exploiting the differential geometry …
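As a minimal illustration of what minimizing an objective on a manifold involves (this is a generic sketch, not the optimality theory developed in the paper), the code below runs Riemannian gradient descent for $f(x) = x^{\top} A x$ on the unit sphere: the Euclidean gradient is projected onto the tangent space, and normalization serves as the retraction. All names and parameters are illustrative.

```python
import numpy as np

def sphere_rgd(A, x0, step=0.1, iters=2000):
    """Riemannian gradient descent for f(x) = x^T A x on the unit sphere.

    The Riemannian gradient is the Euclidean gradient projected onto the
    tangent space at x; renormalizing plays the role of a retraction.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                 # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x   # project onto the tangent space at x
        x = x - step * rgrad              # move along the tangent direction
        x = x / np.linalg.norm(x)         # retract back onto the sphere
    return x

# Minimizing x^T A x over the sphere approaches an eigenvector
# associated with the smallest eigenvalue of the symmetric matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2
x = sphere_rgd(A, rng.standard_normal(5))
```

The projection step is what distinguishes this from Euclidean gradient descent: iterates stay feasible by construction, so no constraint-handling machinery is needed.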

Aubin Property and Uniqueness of Solutions in Cone Constrained Optimization

We discuss conditions for the Aubin property of solutions to perturbed cone constrained programs, by using and refining results given in \cite{KlaKum02}. In particular, we show that constraint nondegeneracy and hence uniqueness of the multiplier is necessary for the Aubin property of the critical point map. Moreover, we give conditions under which the critical point …

Deriving robust counterparts of nonlinear uncertain inequalities

In this paper we provide a systematic way to construct the robust counterpart of a nonlinear uncertain inequality that is concave in the uncertain parameters. We use convex analysis (support functions, conjugate functions, Fenchel duality) and conic duality in order to convert the robust counterpart into an explicit and computationally tractable set of constraints. It …
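As a standard one-line instance of this construction (illustrative only, and far simpler than the nonlinear setting treated in the paper): for a linear inequality with ellipsoidal uncertainty, maximizing over the uncertain parameter is exactly an evaluation of the support function of the uncertainty set, which yields an explicit tractable counterpart:
$$ (\bar a + P\zeta)^{\top} x \le b \ \ \forall\, \|\zeta\|_2 \le 1 \quad\Longleftrightarrow\quad \bar a^{\top} x + \left\| P^{\top} x \right\|_2 \le b. $$
The nonlinear concave case replaces this closed-form maximization by conjugate functions and conic duality, but the resulting constraint is explicit in the same sense.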

A Globally Convergent Stabilized SQP Method

Sequential quadratic programming (SQP) methods are a popular class of methods for nonlinearly constrained optimization. They are particularly effective for solving a sequence of related problems, such as those arising in mixed-integer nonlinear programming and the optimization of functions subject to differential equation constraints. Recently, there has been considerable interest in the formulation of \emph{stabilized} …

A Family of Second-Order Methods for Convex L1-Regularized Optimization

This paper is concerned with the minimization of an objective that is the sum of a convex function $f$ and an $\ell_1$ regularization term. Our interest is in methods that incorporate second-order information about the function $f$ to accelerate convergence. We describe a semi-smooth Newton framework that can be used to generate a variety of …

A quasi-Newton proximal splitting method

A new result in convex analysis on the calculation of proximity operators in certain scaled norms is derived. We describe efficient implementations of the proximity calculation for a useful class of functions; the implementations exploit the piecewise linear nature of the dual problem. The second part of the paper applies the previous result to acceleration …
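To make the notion of a proximity operator in a scaled norm concrete in the simplest possible case (the paper treats more general quasi-Newton scalings via a dual problem; this diagonal example is only a sketch with illustrative names), the code below computes the prox of $\lambda\|x\|_1$ under a diagonal metric $D = \operatorname{diag}(d)$, where the problem separates coordinate-wise into soft-thresholding with a per-coordinate threshold $\lambda / d_i$.

```python
import numpy as np

def prox_l1_scaled(y, lam, d):
    """Proximity operator of lam*||x||_1 in the scaled norm
    ||x||_D^2 = x^T D x with D = diag(d), d > 0:

        argmin_x  lam*||x||_1 + 0.5*(x - y)^T D (x - y)

    The objective separates over coordinates, so each entry is
    soft-thresholded with its own threshold lam / d_i.
    """
    thresh = lam / d
    return np.sign(y) * np.maximum(np.abs(y) - thresh, 0.0)

y = np.array([3.0, -0.5, 1.2])
d = np.array([1.0, 2.0, 4.0])   # larger d_i => smaller effective threshold
p = prox_l1_scaled(y, lam=1.0, d=d)
```

With a non-diagonal (e.g. quasi-Newton) metric the problem no longer separates, which is exactly where a dual formulation becomes useful.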

Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms

In this paper we study new stochastic approximation (SA) type algorithms, namely, the accelerated SA (AC-SA), for solving strongly convex stochastic composite optimization (SCO) problems. Specifically, by introducing a domain shrinking procedure, we significantly improve the large-deviation results associated with the convergence rate of a nearly optimal AC-SA algorithm presented by the authors. Moreover, we …

Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming

In this paper, we introduce a new stochastic approximation (SA) type algorithm, namely the randomized stochastic gradient (RSG) method, for solving an important class of nonlinear (possibly nonconvex) stochastic programming (SP) problems. We establish the complexity of this method for computing an approximate stationary point of a nonlinear programming problem. We also show that this …
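A hedged sketch of the randomized-iterate idea behind RSG-type methods (not the authors' exact algorithm; the objective, oracle, and parameters below are illustrative): run plain stochastic gradient steps on a smooth nonconvex problem, then report an iterate chosen at random. Returning a randomly selected iterate, rather than the last one, is what makes it possible to bound the expected gradient norm at the output in the nonconvex setting.

```python
import numpy as np

def rsg(grad_oracle, x0, stepsize, iters, rng):
    """Sketch of a randomized stochastic gradient (RSG) scheme:
    run SGD for a fixed number of iterations, then return an
    iterate drawn uniformly at random from the trajectory."""
    x = np.asarray(x0, dtype=float)
    iterates = []
    for _ in range(iters):
        iterates.append(x.copy())
        x = x - stepsize * grad_oracle(x, rng)
    R = rng.integers(iters)   # uniformly random return index
    return iterates[R]

# Toy nonconvex objective f(x) = sum(x_i^2 + 0.1*sin(5*x_i)),
# observed through a noisy gradient oracle.
def grad_oracle(x, rng):
    return 2 * x + 0.5 * np.cos(5 * x) + 0.1 * rng.standard_normal(x.shape)

rng = np.random.default_rng(1)
x_out = rsg(grad_oracle, x0=np.ones(3), stepsize=0.05, iters=2000, rng=rng)
```

In the analysis of such methods the return index is often drawn with probabilities tied to the stepsizes; the uniform choice above is the simplest variant.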

Sensitivity analysis for relaxed optimal control problems with final-state constraints

In this article, we compute a second-order expansion of the value function of a family of relaxed optimal control problems with final-state constraints, parameterized by a perturbation variable. The sensitivity analysis is performed for controls that we call R-strong solutions. They are optimal solutions with respect to the set of feasible controls with a uniform …

Analytical formulas for calculating extremal ranks and inertias of quadratic matrix-valued functions

Analytical formulas for calculating the global maximal and minimal ranks and inertias of the quadratic matrix-valued function $$ \phi(X) = \left(\, AXB + C\,\right)\!M\!\left(\, AXB + C \right)^{*} + D $$ are established and their consequences are presented, where $A$, $B$, $C$ and $D$ are given complex matrices with $A$ and $C$ …
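As a numerical companion to these closed-form results (this only evaluates $\phi$ at a single sample point $X$ and reads off rank and inertia by eigenvalue counting; it does not compute the extremal formulas, and all matrices below are illustrative), the sketch forms $\phi(X)$ and computes its inertia, i.e. the triple of counts of positive, negative, and zero eigenvalues.

```python
import numpy as np

def inertia(H, tol=1e-9):
    """Inertia (i_+, i_-, i_0) of a Hermitian matrix H: counts of
    positive, negative, and (numerically) zero eigenvalues."""
    w = np.linalg.eigvalsh(H)
    return (int(np.sum(w > tol)),
            int(np.sum(w < -tol)),
            int(np.sum(np.abs(w) <= tol)))

def phi(X, A, B, C, M, D):
    """Evaluate phi(X) = (A X B + C) M (A X B + C)^* + D."""
    T = A @ X @ B + C
    return T @ M @ T.conj().T + D

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 3))
M = np.diag([1.0, -1.0, 2.0])   # Hermitian middle matrix
D = np.zeros((3, 3))            # Hermitian shift term
X = rng.standard_normal((2, 2))

H = phi(X, A, B, C, M, D)       # Hermitian whenever M and D are
ip, im, i0 = inertia(H)
```

Note that the rank of $\phi(X)$ is simply $i_+ + i_-$, so extremal rank and extremal inertia questions are closely linked.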