We present Nesterov acceleration techniques for the alternating least squares (ALS) method applied to canonical tensor decomposition. Because the tensor decomposition problem is nonconvex, we use a specific version of Nesterov acceleration that retains a convergence guarantee: a momentum term is added whose weight sequence is determined by a one-dimensional search.

We consider a restart mechanism to overcome the numerical instability caused by the line search, which enables effective acceleration.
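The combination of a momentum step, a one-dimensional search for the momentum weight, and a restart can be illustrated on a generic update map. The sketch below is an assumption-laden toy (a gradient step on a quadratic stands in for an ALS sweep, and the one-dimensional search is a crude scan over a few candidate weights), not the paper's actual scheme.

```python
import numpy as np

def f(A, b, x):
    """Quadratic objective standing in for the decomposition residual."""
    return 0.5 * x @ A @ x - b @ x

def grad(A, b, x):
    return A @ x - b

def accelerated_iteration(A, b, x0, step, iters=1000):
    # Nesterov-type acceleration of a plain update map, with the momentum
    # weight chosen by a small one-dimensional search and a restart that
    # falls back to the unaccelerated step whenever the accelerated
    # candidate fails to decrease the objective.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        d = x - x_prev
        # crude 1-D search over a few candidate momentum weights
        best_x, best_f = None, np.inf
        for beta in (0.0, 0.3, 0.6, 0.9):
            y = x + beta * d          # momentum (extrapolation) step
            cand = y - step * grad(A, b, y)
            fc = f(A, b, cand)
            if fc < best_f:
                best_x, best_f = cand, fc
        # restart: discard momentum if it did not improve the objective
        if best_f > f(A, b, x):
            best_x = x - step * grad(A, b, x)
        x_prev, x = x, best_x
    return x
```

On an ill-conditioned diagonal quadratic, the accelerated loop with restart converges noticeably faster than the plain (beta = 0) iteration, which mirrors the behavior reported for accelerated ALS on ill-conditioned problems.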

Our extensive empirical results show that the Nesterov-accelerated ALS methods with restart can be more efficient than stand-alone ALS when problems are ill-conditioned. There is clear potential to extend our Nesterov-type acceleration approach to optimization algorithms other than ALS, and to other nonconvex problems such as the Tucker tensor decomposition.