An Alternating Manifold Proximal Gradient Method for Sparse PCA and Sparse CCA

Sparse principal component analysis (PCA) and sparse canonical correlation analysis (CCA) are two essential techniques from high-dimensional statistics and machine learning for analyzing large-scale data. Both problems can be formulated as optimization problems with a nonsmooth objective and nonconvex constraints. Since nonsmoothness and nonconvexity cause numerical difficulties, most algorithms suggested in the literature either solve … Read more
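
For concreteness, one common way to cast $\ell_1$-penalized sparse PCA as a nonsmooth problem over a nonconvex (Stiefel-manifold) constraint set is

\[ \min_{X \in \R^{n \times r}} \; -\mathrm{tr}\big(X^{\top} A^{\top} A X\big) + \mu \|X\|_1 \quad \text{s.t. } X^{\top} X = I_r, \]

where $A$ is the data matrix, $\mu > 0$ controls sparsity, and $\|X\|_1$ is the entrywise $\ell_1$ norm; whether the paper treats exactly this variant cannot be read off the excerpt.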

Efficient Derivative Evaluation for Rigid-body Dynamics based on Recursive Algorithms subject to Kinematic and Loop Constraints

Simulation, optimization and control of robotic and bio-mechanical systems depend on a mathematical model, typically a system of rigid bodies connected by joints, for which efficient algorithms exist to compute the forward or inverse dynamics. Models that include, e.g., spring-damper systems are subject to both kinematic and loop constraints. Gradient-based optimization and control methods require derivatives … Read more
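
As a minimal illustration of why derivative evaluation matters (not the paper's recursive algorithm), the sketch below differentiates a toy one-degree-of-freedom inverse-dynamics model by central finite differences, the baseline that analytic or recursive derivative schemes aim to improve on; all function names and parameters are hypothetical.

import numpy as np

# Toy 1-DoF pendulum inverse dynamics: point mass m at distance l from the
# pivot, viscous damping b. Stands in for a full recursive Newton-Euler pass.
def inverse_dynamics(q, qd, qdd, m=1.0, l=0.5, b=0.1, g=9.81):
    return m * l**2 * qdd + b * qd + m * g * l * np.sin(q)

# Central finite differences for d(tau)/d(q): simple but costly and noisy,
# which is what dedicated derivative-evaluation algorithms try to avoid.
def dtau_dq(q, qd, qdd, h=1e-6):
    return (inverse_dynamics(q + h, qd, qdd) - inverse_dynamics(q - h, qd, qdd)) / (2 * h)

print(dtau_dq(0.3, 0.0, 0.0))           # finite-difference estimate
print(1.0 * 9.81 * 0.5 * np.cos(0.3))   # exact derivative of the gravity term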

High-Order Evaluation Complexity for Convexly-Constrained Optimization with Non-Lipschitzian Group Sparsity Terms

This paper studies high-order evaluation complexity for partially separable convexly-constrained optimization involving non-Lipschitzian group sparsity terms in a nonconvex objective function. We propose a partially separable adaptive regularization algorithm using a $p$-th order Taylor model and show that the algorithm can produce an $(\epsilon,\delta)$-approximate $q$-th-order stationary point in at most $O(\epsilon^{-(p+1)/(p-q+1)})$ evaluations of the objective … Read more
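
A representative instance of the problem class (an assumption; the excerpt does not pin down the exact structure) is

\[ \min_{x \in \mathcal{F}} \; \sum_{i=1}^{m} f_i(U_i x) \; + \; \sum_{j=1}^{J} \lambda_j \big\| x_{[j]} \big\|^{a}, \qquad 0 < a < 1, \]

where $\mathcal{F}$ is a convex set, each element function $f_i$ is smooth but possibly nonconvex and depends only on the variables selected by $U_i$, the groups $x_{[j]}$ are disjoint blocks of $x$, and the power $a < 1$ makes the group sparsity term non-Lipschitzian at zero.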

A Framework for Peak Shaving Through the Coordination of Smart Homes

In demand-response programs, aggregators balance the needs of generation companies and end-users. This work proposes a two-phase framework that shaves the aggregated peak loads while maintaining the desired comfort level for users. In the first phase, the users determine their planned consumption. In the second phase, we develop a bilevel model with mixed-integer variables and … Read more
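
Schematically (with hypothetical notation, not the paper's exact model), the second-phase bilevel problem has the aggregator minimizing the aggregated peak over coordination signals $\pi$ while each home re-optimizes its own schedule:

\[ \min_{\pi \in \Pi} \; \max_{t \in T} \sum_{u \in U} \ell_{u,t} \quad \text{s.t. } \; (\ell_{u,t})_{t \in T} \in \arg\min_{\ell \in \mathcal{L}_u(\pi)} c_u(\ell) \quad \forall u \in U, \]

where $\mathcal{L}_u(\pi)$ encodes home $u$'s appliance and comfort constraints (including mixed-integer scheduling decisions) and $c_u$ its cost.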

Recovery of a mixture of Gaussians by sum-of-norms clustering

Sum-of-norms clustering is a method for assigning $n$ points in $\R^d$ to $K$ clusters, $1\le K\le n$, using convex optimization. Recently, Panahi et al.\ proved that sum-of-norms clustering is guaranteed to recover a mixture of Gaussians under the restriction that the number of samples is not too large. The purpose of this note is to … Read more
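
Sum-of-norms (convex) clustering computes centroids $u_1,\dots,u_n$ by solving

\[ \min_{u_1,\dots,u_n \in \R^d} \; \frac{1}{2} \sum_{i=1}^{n} \|x_i - u_i\|^2 \; + \; \lambda \sum_{i < j} \|u_i - u_j\|, \]

and assigns points $i$ and $j$ to the same cluster exactly when the optimal centroids coincide, $u_i = u_j$; the regularization weight $\lambda > 0$ controls the number of clusters (scaling and weighting conventions vary across papers).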

Optimal Residential Battery Storage Operations Using Robust Data-driven Dynamic Programming

In this paper, we consider the problem of operating a battery storage unit in a home with a rooftop solar photovoltaic (PV) system so as to minimize expected long-run electricity costs under uncertain electricity usage, PV generation, and electricity prices. Solving this dynamic program using standard techniques is computationally burdensome, and is often complicated by … Read more
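
The underlying dynamic program has the generic Bellman form (notation illustrative, not the paper's):

\[ V_t(s_t, \xi_t) = \min_{a_t \in \mathcal{A}(s_t)} \Big\{ c(s_t, a_t, \xi_t) + \mathbb{E}\big[ V_{t+1}(s_{t+1}, \xi_{t+1}) \,\big|\, \xi_t \big] \Big\}, \]

where $s_t$ is the battery state of charge, $a_t$ the charge/discharge decision, $\xi_t$ collects the uncertain usage, PV generation, and prices, $c$ is the period-$t$ electricity cost, and $s_{t+1}$ follows from the battery dynamics; the dimensionality of $(s_t,\xi_t)$ is what makes standard solution techniques burdensome.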

A scalable mixed-integer decomposition approach for optimal power system restoration

The optimal restoration problem lies at the foundation of the evaluation and improvement of resilience in power systems. In this paper, we present a scalable decomposition algorithm, based on the integer L-shaped method, for solving this problem for realistic power systems. The algorithm works by partitioning the problem into a master problem and a slave … Read more
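
For context, the classical integer L-shaped method (Laporte and Louveaux) keeps a master problem $\min\{ c^{\top} x + \theta \}$ over binary $x$ and underestimates the slave (recourse) value $Q(x)$ with optimality cuts of the form

\[ \theta \;\ge\; \big(Q(\hat{x}) - L\big) \Big( \sum_{i \in S} x_i - \sum_{i \notin S} x_i - |S| + 1 \Big) + L, \qquad S = \{ i : \hat{x}_i = 1 \}, \]

where $L$ is a valid lower bound on $Q$; whether the paper uses this exact cut or a strengthened, restoration-specific variant is not visible in the excerpt.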

Non-asymptotic Results for Langevin Monte Carlo: Coordinate-wise and Black-box Sampling

Euler-Maruyama and Ozaki discretizations of a continuous-time diffusion process are popular sampling techniques that use (up to) gradient and Hessian information of the density, respectively. The Euler-Maruyama discretization, under the name Langevin Monte Carlo (LMC), has been used in particular for sampling from strongly log-concave densities. In this work, we make … Read more
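
As a minimal sketch of the Euler-Maruyama (LMC) iteration for a strongly log-concave target, here is an unadjusted Langevin sampler for a toy two-dimensional Gaussian; all parameters are illustrative.

import numpy as np

# Target: N(mu, Sigma) with known inverse covariance, so the gradient of the
# potential U(x) = -log density is available in closed form.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma_inv = np.array([[2.0, 0.3], [0.3, 1.0]])

def grad_U(x):
    return Sigma_inv @ (x - mu)

eta = 0.05                           # step size of the Euler-Maruyama discretization
x = np.zeros(2)
samples = []
for _ in range(5000):
    noise = rng.standard_normal(2)
    x = x - eta * grad_U(x) + np.sqrt(2.0 * eta) * noise
    samples.append(x.copy())

print(np.mean(samples, axis=0))      # close to mu, up to discretization bias

The Ozaki discretization additionally uses Hessian information of the potential; the coordinate-wise and black-box variants studied in the paper are not reproduced here.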

A study of rank-one sets with linear side constraints and application to the pooling problem

We study sets defined as the intersection of a rank-1 constraint with different choices of linear side constraints. We identify various conditions on the linear side constraints under which the convex hull of the rank-1 set is polyhedral or second-order-cone representable. In all these cases, we also show that a linear objective can be … Read more
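
One representative shape of such a set (illustrative; the paper considers several choices of side constraints) is

\[ S \;=\; \big\{ (x, y, W) \in \R^{m} \times \R^{n} \times \R^{m \times n} \;:\; W = x y^{\top}, \;\; Ax + By \le b \big\}, \]

where the rank-1 (bilinear) matrix $W$ collects products of the original variables, as arises when the pooling problem's flow-times-concentration terms are lifted.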

Rank-one Convexification for Sparse Regression

Sparse regression models are increasingly prevalent due to their interpretability and superior out-of-sample performance. However, the exact sparse regression model with an $\ell_0$ constraint restricting the support of the estimators is a challenging nonconvex optimization problem. In this paper, we derive new strong convex relaxations for sparse regression. These relaxations are based … Read more
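
The exact model referred to is, in its basic form,

\[ \min_{\beta \in \R^{p}} \; \|y - X\beta\|_2^2 \quad \text{s.t. } \; \|\beta\|_0 \le k, \]

where $\|\beta\|_0$ counts the nonzero entries of $\beta$ and $k$ bounds the support size; a ridge term $\gamma\|\beta\|_2^2$ is often added, and the convex relaxations derived in the paper replace this combinatorial constraint (details beyond the excerpt).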