On accelerated methods for tensor canonical polyadic decomposition


We present Nesterov acceleration techniques for the alternating least squares (ALS) method applied to canonical polyadic tensor decomposition. Because the tensor decomposition problem is nonconvex, we use a variant of Nesterov acceleration that adds a momentum term with a specific weight sequence determined by a one-dimensional search, which provides a convergence guarantee. We also employ a restart mechanism to overcome the numerical instability caused by the line search, enabling effective acceleration. Our extensive empirical results show that the Nesterov-accelerated ALS methods with restart can be more efficient than stand-alone ALS when problems are ill-conditioned. There is clear potential to extend our Nesterov-type acceleration approach to optimization algorithms other than ALS and to other nonconvex problems such as the Tucker tensor decomposition.
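As a rough illustration of the scheme described above, the following sketch wraps a generic fixed-point map (standing in for one ALS sweep) with a momentum term whose weight is chosen by a crude one-dimensional search, plus a restart that drops momentum whenever the accelerated candidate fails to decrease the objective. The function names, the grid search over momentum weights, and the toy quadratic are illustrative assumptions, not the paper's actual algorithm or problem.

```python
import numpy as np

def accelerated_iterate(step, objective, x0, iters=100):
    """Nesterov-type acceleration with restart around a generic map `step`.

    Illustrative sketch only: `step` stands in for one ALS sweep, the
    momentum weight is picked by a coarse one-dimensional search, and the
    iteration restarts (momentum dropped) when acceleration does not help.
    """
    x_prev = x0.copy()
    x = step(x0)
    for _ in range(iters):
        d = x - x_prev                        # momentum direction
        # crude one-dimensional search over candidate momentum weights
        betas = np.linspace(0.0, 1.0, 11)
        y = min((x + b * d for b in betas), key=objective)
        x_next = step(y)
        if objective(x_next) > objective(x):
            x_next = step(x)                  # restart: plain step, no momentum
        x_prev, x = x, x_next
    return x

# Toy usage: minimize an ill-conditioned quadratic f(x) = 0.5 x^T A x,
# using a damped gradient step as a stand-in for the ALS sweep.
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad_step = lambda x: x - 0.009 * (A @ x)
x = accelerated_iterate(grad_step, f, np.array([1.0, 1.0]), iters=300)
print(f(x))  # small residual objective value
```

Because the unaccelerated point (momentum weight zero) is always among the search candidates and the restart falls back to a plain step, the sketch never does worse than the stand-alone iteration on this toy problem.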


Canonical tensor decomposition, alternating least squares, Nesterov acceleration, nonconvex optimization

Short address: https://sciup.org/142230095

IDR: 142230095

Research article