An optimization problem for dynamic OD trip matrix estimation on transit networks with different types of data collection units

Dynamic O-D trip matrices for public transportation systems provide a valuable source of information on how the system is used, which may be employed either by planners to better design transportation facilities or by administrations to characterize the efficiency of the transport system both in peak hours and …

Rank computation in Euclidean Jordan algebras

Euclidean Jordan algebras are the abstract foundation for symmetric cone optimization. Every element in a Euclidean Jordan algebra has a complete spectral decomposition analogous to the spectral decomposition of a real symmetric matrix into rank-one projections. The spectral decomposition in a Euclidean Jordan algebra stems from the likewise-analogous characteristic polynomial of its elements, whose degree is …
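
As a concrete reference point, the following numpy sketch works in the algebra of real symmetric matrices, the prototypical Euclidean Jordan algebra: it rebuilds a symmetric matrix from eigenvalue-weighted rank-one projections and counts the nonzero eigenvalues. It illustrates only the matrix analogue alluded to above, not the general algebraic setting of the paper.

    import numpy as np

    # Matrix analogue of the spectral decomposition discussed above:
    # a real symmetric matrix splits into eigenvalues times rank-one
    # orthogonal projections.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((4, 4))
    A = (B + B.T) / 2                      # a generic real symmetric matrix

    eigvals, Q = np.linalg.eigh(A)         # A = Q diag(eigvals) Q^T
    projections = [np.outer(Q[:, i], Q[:, i]) for i in range(A.shape[0])]

    # Reconstruct A as a sum of eigenvalue-weighted rank-one projections.
    A_rebuilt = sum(lam * P for lam, P in zip(eigvals, projections))
    assert np.allclose(A, A_rebuilt)

    # The rank of the element is the number of nonzero eigenvalues.
    rank = int(np.sum(np.abs(eigvals) > 1e-10))
    print(rank)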

FISTA and Extensions – Review and New Insights

The purpose of this technical report is to review the main properties of an accelerated composite gradient (ACG) method commonly referred to as the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). In addition, we state a version of FISTA for solving both convex and strongly convex composite minimization problems and derive its iteration complexities to generate iterates …
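
As a reminder of the basic scheme under review, here is a minimal FISTA sketch for the l1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1 with a fixed step size 1/L; the strongly convex variant and the complexity results stated in the report are not reproduced here.

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal operator of tau*||.||_1: the "shrinkage-thresholding" step.
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def fista_lasso(A, b, lam, num_iters=500):
        # Minimal FISTA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
        L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
        x_prev = np.zeros(A.shape[1])
        y = x_prev.copy()
        t = 1.0
        for _ in range(num_iters):
            grad = A.T @ (A @ y - b)       # gradient of the smooth part at y
            x = soft_threshold(y - grad / L, lam / L)
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x + ((t - 1.0) / t_next) * (x - x_prev)   # Nesterov momentum step
            x_prev, t = x, t_next
        return x_prev

    # Example usage on a small random instance.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    b = rng.standard_normal(50)
    x_hat = fista_lasso(A, b, lam=0.1)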

A new stopping criterion for Krylov solvers applied in Interior Point Methods

This paper presents a surprising result with possibly far-reaching consequences for any optimization technique that relies on Krylov subspace methods to solve its underlying linear systems. The advantages of the new technique are illustrated in the context of Interior Point Methods (IPMs). When an iterative method is …
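
For context only, the sketch below shows a plain conjugate gradient solver with the conventional relative-residual stopping test, i.e. the kind of criterion this line of work revisits; the new stopping criterion proposed in the paper is not reproduced here.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-8, max_iters=1000):
        # Plain conjugate gradient with the conventional stopping rule
        # ||r_k|| <= tol * ||b||.  This is the standard test a Krylov solver
        # inside an IPM would typically use; it is not the paper's criterion.
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs_old = r @ r
        b_norm = np.linalg.norm(b)
        for _ in range(max_iters):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) <= tol * b_norm:   # conventional relative-residual test
                break
            rs_new = r @ r
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x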

Dealing with inequality constraints in large-scale semidefinite relaxations for graph coloring and maximum clique problems

Semidefinite programs (SDPs) can be solved in polynomial time by interior point methods. However, when the dimension of the problem becomes large, interior point methods become impractical in terms of both computational time and memory requirements. Certain first-order methods, such as Alternating Direction Methods of Multipliers (ADMMs), have been established as suitable algorithms to deal with large-scale …
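
To make the comparison concrete, here is a generic two-block ADMM splitting for an SDP in standard equality-constrained form. It is a textbook scheme, not necessarily the variant with inequality constraints analysed in the paper.

    import numpy as np

    def psd_projection(M):
        # Project a symmetric matrix onto the PSD cone by clipping eigenvalues.
        w, Q = np.linalg.eigh((M + M.T) / 2)
        return (Q * np.maximum(w, 0.0)) @ Q.T

    def admm_sdp(C, A_list, b, rho=1.0, num_iters=500):
        # Generic two-block ADMM sketch for  min <C,X>  s.t.  <A_i,X> = b_i,  X PSD.
        # Splitting: X carries the objective and the affine constraints, Z the PSD cone.
        n = C.shape[0]
        M = np.vstack([Ai.reshape(-1) for Ai in A_list])   # m x n^2 constraint matrix
        MMt = M @ M.T

        def affine_proj(V):
            # Orthogonal projection onto {X : <A_i, X> = b_i for all i}.
            v = V.reshape(-1)
            correction = M.T @ np.linalg.solve(MMt, b - M @ v)
            return (v + correction).reshape(n, n)

        X = np.zeros((n, n)); Z = np.zeros((n, n)); U = np.zeros((n, n))
        for _ in range(num_iters):
            X = affine_proj(Z - U - C / rho)    # objective + affine-constraint step
            Z = psd_projection(X + U)           # PSD step (one eigendecomposition)
            U = U + X - Z                       # scaled dual update
        return Z

The per-iteration cost is dominated by a single eigendecomposition for the PSD projection, which is what makes such splittings attractive when interior point methods run out of time or memory.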

Frank-Wolfe and friends: a journey into projection-free first-order optimization methods

Invented some 65 years ago in a seminal paper by Marguerite Straus-Frank and Philip Wolfe, the Frank-Wolfe method has recently enjoyed a remarkable revival, fuelled by the need for fast and reliable first-order optimization methods in Data Science and other relevant application areas. This review tries to explain the success of this approach by illustrating versatility …
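
A minimal sketch of the classical method over the probability simplex illustrates why it is called projection-free: the linear minimization oracle returns a vertex, and iterates stay feasible by taking convex combinations, so no projection is ever computed.

    import numpy as np

    def frank_wolfe_simplex(grad_f, x0, num_iters=200):
        # Minimal Frank-Wolfe sketch over the probability simplex.
        x = x0.copy()
        for k in range(num_iters):
            g = grad_f(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0              # LMO: best vertex of the simplex
            gamma = 2.0 / (k + 2.0)            # classic open-loop step size
            x = (1.0 - gamma) * x + gamma * s  # convex combination keeps feasibility
        return x

    # Example: minimize a convex quadratic f(x) = 0.5*||Ax - b||^2 over the simplex.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    grad_f = lambda x: A.T @ (A @ x - b)
    x_star = frank_wolfe_simplex(grad_f, np.full(10, 0.1))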

Analysis of the Frank-Wolfe Method for Convex Composite Optimization involving a Logarithmically-Homogeneous Barrier

We present and analyze a new generalized Frank-Wolfe method for the composite optimization problem (P): $F^* := \min_x f(Ax) + h(x)$, where $f$ is a $\theta$-logarithmically-homogeneous self-concordant barrier and the function $h$ has bounded domain but is possibly non-smooth. We show that our generalized Frank-Wolfe method requires $O((Gap_0 + \theta + Var_h)\ln(\delta_0) + (\theta + Var_h)^2/\epsilon)$ …
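
For readers unfamiliar with the barrier class, the defining identity (standard in the interior-point literature, not specific to this paper) is logarithmic homogeneity of degree $\theta$ on the underlying cone $K$: $f(tx) = f(x) - \theta \ln t$ for all $x \in \operatorname{int} K$ and $t > 0$. Canonical examples are $f(x) = -\sum_{i=1}^n \ln x_i$ on the nonnegative orthant and $f(X) = -\ln\det X$ on the cone of positive semidefinite $n \times n$ matrices, both with $\theta = n$.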

Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient

We present PDLP, a practical first-order method for linear programming (LP) that can solve problems to the high levels of accuracy expected in traditional LP applications. In addition, it can scale to very large problems because its core operation is matrix-vector multiplication. PDLP is derived by applying the primal-dual hybrid gradient (PDHG) method, popularized …
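
The base iteration is vanilla PDHG applied to the saddle-point form of an equality-constrained LP, as sketched below; PDLP layers restarts, preconditioning, adaptive step sizes and presolve on top of this, none of which appear in the sketch.

    import numpy as np

    def pdhg_lp(c, A, b, num_iters=5000):
        # Vanilla PDHG sketch for the LP  min c^T x  s.t.  Ax = b, x >= 0,
        # written as the saddle point  min_{x>=0} max_y  c^T x + y^T (b - Ax).
        eta = 0.9 / np.linalg.norm(A, 2)    # step sizes satisfying tau*sigma*||A||^2 < 1
        x = np.zeros(A.shape[1])
        y = np.zeros(A.shape[0])
        for _ in range(num_iters):
            x_next = np.maximum(x - eta * (c - A.T @ y), 0.0)   # projected primal step
            y = y + eta * (b - A @ (2.0 * x_next - x))          # dual step with extrapolation
            x = x_next
        return x, y

Every step is a product with A or its transpose, which is the source of the scalability claim above.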

Hashing embeddings of optimal dimension, with applications to linear least squares

The aim of this paper is two-fold: firstly, to present subspace embedding properties for $s$-hashing sketching matrices, with $s\geq 1$, that are optimal in the projection dimension $m$ of the sketch, namely, $m=O(d)$, where $d$ is the dimension of the subspace. A diverse set of results is presented, addressing the case when the input …
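
A minimal sketch of the sketch-and-solve idea with an s-hashing matrix is given below; the constants (e.g. m = 4d and s = 2) are illustrative choices, not the paper's.

    import numpy as np

    def s_hashing_matrix(m, n, s=2, rng=None):
        # Build an s-hashing sketching matrix S in R^{m x n}: each column has
        # exactly s nonzero entries, equal to +/- 1/sqrt(s), in rows chosen
        # uniformly at random.  (Dense construction for clarity; in practice
        # S is stored and applied as a sparse operator.)
        rng = np.random.default_rng(rng)
        S = np.zeros((m, n))
        for j in range(n):
            rows = rng.choice(m, size=s, replace=False)
            S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
        return S

    # Sketch-and-solve for an overdetermined least-squares problem min_x ||Ax - b||:
    # with m = O(d) rows the sketched problem is much smaller than the original.
    rng = np.random.default_rng(0)
    n, d = 10000, 50
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    S = s_hashing_matrix(m=4 * d, n=n, s=2, rng=rng)
    x_sketched, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)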

MIMO Radar Optimization With Constant-Modulus and Any p-Norm Similarity Constraints

MIMO radar plays a key role in autonomous driving, and the waveform similarity constraint is important in radar waveform design. However, the joint constant-modulus and similarity constraint is difficult to handle. Only the special case with $\infty$-norm similarity and constant-modulus constraints has been tackled by the semidefinite relaxation (SDR) and the successive quadratic refinement …
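
To fix notation (assumed here, not taken from the paper), the two constraint sets can be checked as follows: unit modulus $|x_n| = 1$ entrywise, and p-norm similarity $\|x - x_0\|_p \leq \delta$ to a reference code $x_0$.

    import numpy as np

    # Illustration of the two constraint sets discussed above (notation assumed):
    # a constant-modulus code x with |x_n| = 1 for all n, and a p-norm similarity
    # constraint ||x - x0||_p <= delta to a reference code x0.
    N, p, delta = 64, 2, 1.5
    n = np.arange(N)
    x0 = np.exp(1j * np.pi * n**2 / N)       # reference waveform (a chirp-like code)

    # A candidate design: keep unit modulus by perturbing only the phases of x0.
    rng = np.random.default_rng(0)
    x = x0 * np.exp(1j * 0.1 * rng.standard_normal(N))

    constant_modulus_ok = np.allclose(np.abs(x), 1.0)           # |x_n| = 1
    similarity_ok = np.linalg.norm(x - x0, ord=p) <= delta      # ||x - x0||_p <= delta
    print(constant_modulus_ok, similarity_ok)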