TY - GEN
T1 - Probabilistic neural-kernel tensor decomposition
AU - Tillinghast, Conor
AU - Fang, Shikai
AU - Zhang, Kai
AU - Zhe, Shandian
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/11
Y1 - 2020/11
N2 - Tensor decomposition is a fundamental framework to model and analyze multiway data, which are ubiquitous in real-world applications. A critical challenge of tensor decomposition is to capture a variety of complex relationships/interactions while avoiding overfitting the data, which are usually very sparse. Although numerous tensor decomposition methods have been proposed, they are mostly based on a multilinear form and hence are incapable of estimating more complex, nonlinear relationships. To address the challenge, we propose POND, PrObabilistic Neural-kernel tensor Decomposition, which unifies the self-adaptation of Bayesian nonparametric function learning and the expressive power of neural networks. POND uses Gaussian processes (GPs) to model the hidden relationships and can automatically detect their complexity in tensors, preventing both underfitting and overfitting. POND then incorporates convolutional neural networks to construct the GP kernel, greatly enhancing its capability of estimating highly nonlinear relationships. To scale POND to large data, we use the sparse variational GP framework and the reparameterization trick to develop an efficient stochastic variational learning algorithm. On both synthetic and real-world benchmark datasets, POND often exhibits better predictive performance than the state-of-the-art nonlinear tensor decomposition methods. In addition, as a Bayesian approach, POND provides the posterior distribution of the latent factors, and hence can conveniently quantify their uncertainty and the confidence levels for predictions.
KW - Bayesian nonparametrics
KW - Kernel
KW - Neural networks
KW - Tensor decomposition
UR - https://www.scopus.com/pages/publications/85100880400
U2 - 10.1109/ICDM50108.2020.00062
DO - 10.1109/ICDM50108.2020.00062
M3 - Conference contribution
AN - SCOPUS:85100880400
T3 - Proceedings - IEEE International Conference on Data Mining, ICDM
SP - 531
EP - 540
BT - Proceedings - 20th IEEE International Conference on Data Mining, ICDM 2020
A2 - Plant, Claudia
A2 - Wang, Haixun
A2 - Cuzzocrea, Alfredo
A2 - Zaniolo, Carlo
A2 - Wu, Xindong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 20th IEEE International Conference on Data Mining, ICDM 2020
Y2 - 17 November 2020 through 20 November 2020
ER -