On accelerated methods for tensor canonical polyadic decomposition

Posted in Papers

We present Nesterov acceleration techniques for alternating least squares (ALS) methods applied to canonical polyadic (CP) tensor decomposition. Since the tensor decomposition problem is nonconvex, we use a specific version of Nesterov acceleration designed to retain a convergence guarantee: a momentum term is added, with its weight sequence determined by a one-dimensional search.
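To make the update concrete, here is a minimal sketch (not the authors' code), assuming a 3-way tensor and plain NumPy: one Nesterov-type accelerated ALS step for a rank-R CP decomposition, where ALS is applied to the extrapolated point y = x_k + beta * (x_k - x_{k-1}). The momentum weight beta is picked by a simple grid search over the residual, standing in for the one-dimensional search mentioned above; all function names are illustrative.

```python
# Sketch of one Nesterov-accelerated ALS step for a 3-way CP decomposition.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor (row-major column ordering)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of two matrices with R columns each."""
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def cp_error(T, factors):
    """Frobenius-norm residual of the rank-R CP model [A, B, C]."""
    A, B, C = factors
    return np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C))

def als_sweep(T, factors):
    """One full ALS sweep: each factor is updated by a least-squares solve."""
    A, B, C = factors
    A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
    B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
    C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return [A, B, C]

def accelerated_als_step(T, x_curr, x_prev, betas=np.linspace(0.0, 1.0, 11)):
    """ALS applied to an extrapolated point y = x_k + beta * (x_k - x_{k-1}),
    with beta chosen by a one-dimensional search over the CP residual."""
    best_beta, best_err, best_y = 0.0, np.inf, x_curr
    for beta in betas:
        y = [Fc + beta * (Fc - Fp) for Fc, Fp in zip(x_curr, x_prev)]
        err = cp_error(T, y)
        if err < best_err:
            best_beta, best_err, best_y = beta, err, y
    return als_sweep(T, best_y), best_beta
```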

We also consider a restart mechanism that overcomes the numerical instability introduced by the line search and thus enables effective acceleration.
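As an illustration of how such a restart might be wired in, the sketch below reuses `als_sweep`, `cp_error`, and `accelerated_als_step` from the sketch above: whenever the accelerated step increases the residual, the momentum is dropped and a plain ALS sweep is taken from the last accepted iterate. The paper's actual restart criterion may be more refined than this simple residual check.

```python
# Sketch of the outer loop with restarts around the accelerated ALS step.
import numpy as np

def restarted_accelerated_als(T, factors0, n_iters=200, tol=1e-10):
    x_prev = [F.copy() for F in factors0]
    x_curr = als_sweep(T, x_prev)            # one plain ALS sweep to start
    err = cp_error(T, x_curr)
    for _ in range(n_iters):
        x_next, _ = accelerated_als_step(T, x_curr, x_prev)
        new_err = cp_error(T, x_next)
        if new_err > err:                     # acceleration increased the residual:
            x_next = als_sweep(T, x_curr)     # restart, i.e. drop the momentum
            new_err = cp_error(T, x_next)     # and take a plain ALS step instead
        if abs(err - new_err) <= tol * max(err, 1.0):
            return x_next, new_err
        x_prev, x_curr, err = x_curr, x_next, new_err
    return x_curr, err

# Toy usage: recover a random rank-3 tensor from a random initialization.
rng = np.random.default_rng(0)
T = np.einsum('ir,jr,kr->ijk', *(rng.standard_normal((20, 3)) for _ in range(3)))
init = [rng.standard_normal((20, 3)) for _ in range(3)]
factors, final_err = restarted_accelerated_als(T, init)
print(final_err)
```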

Our extensive empirical results show that Nesterov-accelerated ALS with restart can be more efficient than stand-alone ALS on ill-conditioned problems. There is clear potential to extend our Nesterov-type acceleration approach to optimization algorithms other than ALS and to other nonconvex problems, such as the Tucker tensor decomposition.

I am a PhD student at the Skolkovo Institute of Science and Technology and a Senior Lecturer at the Moscow Institute of Physics and Technology. I love math, teaching, history, travelling and the ambiguity of being.
