Graph regularized sparse nonnegative Tucker decomposition with ℓ0-constraints for unsupervised learning
Graphical Abstract
Abstract
Nonnegative Tucker decomposition (NTD) is a powerful feature extraction tool widely used for dimensionality reduction and clustering of multi-dimensional data. In this paper, we propose a novel graph regularized sparse nonnegative Tucker decomposition method with ℓ0-norm constraints (ℓ0-GSNTD). Most existing sparse NTD methods overlook the manifold structure of the data and promote sparsity of the core tensor and factor matrices only indirectly, through relaxations of ℓ0-norm regularization that offer no explicit control over the sparsity level. In contrast, our method incorporates graph regularization into NTD to encode the manifold structure of the data and imposes the ℓ0-norm constraints directly, explicitly controlling the sparsity of the core tensor and factor matrices and thereby enhancing the feature extraction capability. However, because NTD is nonconvex and the ℓ0-norm constraints are nonconvex and nonsmooth, optimizing ℓ0-GSNTD is NP-hard. To tackle these challenges, we propose a Proximal Alternating Linearized (PAL) algorithm that solves the original ℓ0-GSNTD problem, and an inertial variant, the inertial PAL (iPAL) algorithm, to accelerate convergence. These algorithms provide a practical, provably convergent scheme for solving ℓ0-GSNTD directly, without relaxing its constraints. Furthermore, we prove that the sequences generated by both algorithms converge globally to a critical point, and we analyze their per-iteration complexity. Experimental results on unsupervised clustering tasks over ten real-world benchmark datasets demonstrate that our method outperforms several state-of-the-art methods.
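A key ingredient of solving ℓ0-constrained problems directly, as the abstract describes, is that the projection onto a nonnegative ℓ0-ball {z : z ≥ 0, ‖z‖0 ≤ s} has a closed form: clip negative entries to zero and keep the s largest remaining entries. The following minimal NumPy sketch illustrates this hard-thresholding projection; it is an illustrative example of the generic operation, not the paper's implementation, and the function name `project_l0_nonneg` is our own.

```python
import numpy as np

def project_l0_nonneg(x, s):
    """Project a vector x onto {z : z >= 0, ||z||_0 <= s}.

    Closed-form solution: set negative entries to zero, then keep
    only the s largest remaining entries (hard thresholding).
    Illustrative sketch; not the paper's implementation.
    """
    z = np.maximum(x, 0.0)          # enforce nonnegativity
    if np.count_nonzero(z) <= s:    # already sparse enough
        return z
    keep = np.argpartition(z, -s)[-s:]  # indices of the s largest entries
    out = np.zeros_like(z)
    out[keep] = z[keep]             # zero out everything else
    return out

# Example: keep the 2 largest nonnegative entries of a small vector
v = np.array([3.0, -1.0, 2.0, 0.5])
print(project_l0_nonneg(v, 2))  # -> [3. 0. 2. 0.]
```

In a PAL-type scheme, each block update takes a gradient step on the smooth part of the objective and then applies such a projection, which keeps every iterate feasible without relaxing the ℓ0 constraint.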