Successive Rank-One Approximations of Nearly Orthogonally Decomposable Symmetric Tensors

Many idealized problems in signal processing, machine learning, and statistics can be reduced to finding the symmetric canonical decomposition of an underlying symmetric and orthogonally decomposable (SOD) tensor. Drawing inspiration from the matrix case, the successive rank-one approximations (SROA) scheme has been proposed and shown to recover this tensor decomposition exactly, and a plethora of numerical methods have thus been developed for the tensor rank-one approximation problem. In practice, however, the inevitable errors from estimation, computation, and modeling mean that the input tensor can only be assumed to be a nearly SOD tensor, i.e., a symmetric tensor slightly perturbed from the underlying SOD tensor. This article shows that even in the presence of such perturbation, SROA can still robustly recover the symmetric canonical decomposition of the underlying tensor: when the perturbation error is small enough, the approximation errors do not accumulate with the number of iterations. Numerical results are presented to support the theoretical findings.
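To make the scheme concrete: an SOD tensor admits a decomposition T = sum_k lambda_k u_k^(⊗p) with orthonormal vectors u_k, and SROA greedily extracts one rank-one term at a time, deflating the residual after each extraction. Below is a minimal sketch for third-order symmetric tensors in Python/NumPy. It uses the tensor power method with random restarts as the rank-one approximation subroutine, which is one of many possible choices; the function names, default parameters, and the synthetic nearly SOD test tensor are illustrative assumptions, not the article's reference implementation.

```python
import itertools
import numpy as np

def rank_one_approx(T, n_iters=100, n_restarts=10, seed=None):
    """Approximate the dominant symmetric rank-one term of a symmetric
    third-order tensor T via the tensor power method with restarts."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    best_lam, best_u = 0.0, np.zeros(n)
    for _ in range(n_restarts):
        u = rng.standard_normal(n)
        u /= np.linalg.norm(u)
        for _ in range(n_iters):
            v = np.einsum('ijk,j,k->i', T, u, u)  # contraction T(I, u, u)
            nv = np.linalg.norm(v)
            if nv == 0.0:
                break
            u = v / nv
        lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # lambda = T(u, u, u)
        if abs(lam) > abs(best_lam):
            best_lam, best_u = lam, u
    return best_lam, best_u

def sroa(T, rank):
    """Successive rank-one approximations: repeatedly extract a rank-one
    term, then deflate the residual tensor before the next extraction."""
    T = T.copy()
    lams, us = [], []
    for _ in range(rank):
        lam, u = rank_one_approx(T)
        lams.append(lam)
        us.append(u)
        T = T - lam * np.einsum('i,j,k->ijk', u, u, u)  # deflation step
    return np.array(lams), np.array(us)

if __name__ == "__main__":
    # Synthetic check: build a nearly SOD tensor and recover its components.
    rng = np.random.default_rng(0)
    n, r = 8, 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns
    lams_true = np.array([4.0, 3.0, 2.0, 1.0])
    T = sum(l * np.einsum('i,j,k->ijk', q, q, q)
            for l, q in zip(lams_true, Q[:, :r].T))
    # Symmetric perturbation: average a random tensor over all index orders.
    E = rng.standard_normal((n, n, n))
    E = sum(E.transpose(p) for p in itertools.permutations(range(3))) / 6.0
    T_hat = T + 1e-3 * E  # nearly SOD input tensor
    lams, us = sroa(T_hat, r)
    # Components are recovered up to sign; |cos| close to 1 means success.
    for lam, u in zip(lams, us):
        j = int(np.argmax(np.abs(Q[:, :r].T @ u)))
        print(f"lam={lam:.4f}, matched component {j}, |cos|={abs(Q[:, j] @ u):.6f}")
```

In this sketch the deflation step mirrors the matrix case: after a rank-one term is found, it is subtracted from the current tensor, and the next term is extracted from the residual. The article's robustness result concerns exactly this loop: each subtraction carries some error, yet for a small enough perturbation those errors do not compound across iterations.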