Band Gap Optimization of Two-Dimensional Photonic Crystals Using Semidefinite Programming and Subspace Methods

In this paper, we consider the optimal design of photonic crystal band structures for two-dimensional square lattices. The mathematical formulation of the band gap optimization problem leads to an infinite-dimensional Hermitian eigenvalue optimization problem parametrized by the dielectric material and the wave vector. To make the problem tractable, the original eigenvalue problem is discretized using …
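For reference, a standard form of the underlying eigenproblem (assumed here for illustration; the paper's exact formulation may differ): for a two-dimensional crystal with dielectric function \(\varepsilon(x)\), the TM-polarized Bloch modes at wave vector \(k\) satisfy
\[
-\Delta u = \lambda\,\varepsilon(x)\,u, \qquad u(x) = e^{i k\cdot x} v(x),\ v\ \text{periodic},\ \lambda = (\omega/c)^2 ,
\]
and the gap between bands \(m\) and \(m+1\), which the design problem seeks to maximize over admissible \(\varepsilon\), is
\[
\min_{k}\lambda_{m+1}(k) - \max_{k}\lambda_{m}(k).
\]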

An Augmented Lagrangian Approach for Sparse Principal Component Analysis

Principal component analysis (PCA) is a widely used technique for data analysis and dimension reduction with numerous applications in science and engineering. However, the standard PCA suffers from the fact that the principal components (PCs) are usually linear combinations of all the original variables, and it is thus often difficult to interpret the PCs. To …
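To illustrate the kind of problem meant by sparse PCA (a common single-component formulation, not necessarily the one treated in this paper): given a sample covariance matrix \(\Sigma\) and a sparsity weight \(\rho \ge 0\), one seeks
\[
\max_{x \in \mathbb{R}^n}\; x^{\top}\Sigma x - \rho\,\|x\|_1 \quad \text{subject to} \quad \|x\|_2 \le 1 ,
\]
so that the leading component trades explained variance against the number of nonzero loadings.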

Old Wine in a New Bottle: The MILP Road to MIQCP

This paper surveys results on the NP-hard mixed-integer quadratically constrained programming problem. The focus is on strong convex relaxations and valid inequalities, which can serve as the basis of efficient global techniques. In particular, we discuss relaxations and inequalities arising from the algebraic description of the problem as well as from dynamic procedures based on disjunctive programming. …
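For concreteness, a generic instance of the problem class surveyed (notation assumed here, not taken from the abstract) is
\[
\min_{x}\ x^{\top}Q_0 x + c_0^{\top}x \quad \text{s.t.} \quad x^{\top}Q_i x + c_i^{\top}x \le b_i,\ i=1,\dots,m, \qquad x_j \in \mathbb{Z}\ \text{for } j \in I ,
\]
where the matrices \(Q_i\) need not be positive semidefinite, which is what makes strong convex relaxations and valid inequalities central to global solution methods.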

Sample Average Approximation for Stochastic Dominance Constrained Programs

In this paper we study optimization problems with second-order stochastic dominance constraints. This class of problems has been receiving increasing attention in the literature as it allows for the modeling of optimization problems where a risk-averse decision maker wants to ensure that the solution produced by the model dominates certain benchmarks. Here we deal with …
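For background, the usual definition of a second-order stochastic dominance constraint (standard in this literature, though the paper's notation may differ): a random outcome \(G(x,\xi)\) dominates a benchmark \(Y\) in second order when
\[
\mathbb{E}\bigl[(\eta - G(x,\xi))_+\bigr] \;\le\; \mathbb{E}\bigl[(\eta - Y)_+\bigr] \qquad \text{for all } \eta \in \mathbb{R} ,
\]
and a sample average approximation replaces both expectations by averages over i.i.d. samples, yielding finitely many constraints when \(Y\) has finitely many realizations.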

Convergence and Convergence Rate of Stochastic Gradient Search in the Case of Multiple and Non-Isolated Extrema

The asymptotic behavior of stochastic gradient algorithms is studied. Relying on results from differential geometry (the Lojasiewicz gradient inequality), almost sure point-convergence is demonstrated and relatively tight almost sure bounds on the convergence rate are derived. In sharp contrast to all existing results of this kind, the asymptotic results obtained here do not require …
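To make the two ingredients concrete (standard forms, assumed here rather than quoted from the paper): the algorithm is the usual stochastic gradient recursion
\[
x_{k+1} = x_k - \gamma_k\bigl(\nabla f(x_k) + \xi_k\bigr)
\]
with step sizes \(\gamma_k\) and noise \(\xi_k\), and the Lojasiewicz gradient inequality states that for real-analytic \(f\) and any point \(a\) there exist a neighborhood of \(a\), \(C > 0\) and \(\theta \in (0,1)\) such that
\[
|f(x) - f(a)|^{\theta} \;\le\; C\,\|\nabla f(x)\| ,
\]
which controls how slowly the gradient can vanish near non-isolated critical points.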

The Farkas Lemma Revisited

The Farkas Lemma is extended to simultaneous linear operator and polyhedral sublinear operator inequalities by Boolean valued analysis.
Citation: Sobolev Institute of Mathematics, Novosibirsk, 630090 Russia
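For orientation, the classical finite-dimensional statement being generalized: for a matrix \(A\) and vector \(b\), exactly one of the following systems has a solution,
\[
\exists\, x \ge 0:\ Ax = b \qquad \text{or} \qquad \exists\, y:\ A^{\top}y \ge 0,\ b^{\top}y < 0 .
\]
The paper extends this alternative to simultaneous linear operator and polyhedral sublinear operator inequalities.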

A Combined Class of Self-Scaling and Modified Quasi-Newton Methods

Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain 'better' curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has global and superlinear convergence for convex functions. Numerical experiments with this class, using the …
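For context, a standard self-scaling BFGS update of the kind being combined (a generic form, assumed for illustration; the paper's modified update may differ): with \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\),
\[
B_{k+1} = \tau_k\!\left(B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}\right) + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
\tau_k = \frac{y_k^{\top} s_k}{s_k^{\top} B_k s_k},
\]
which keeps \(B_{k+1}\) positive definite whenever \(B_k\) is positive definite and \(y_k^{\top} s_k > 0\).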