TY - GEN
T1 - IO-aware Factorization Machine for User Response Prediction
AU - Hu, Zhenhao
AU - Peng, Chao
AU - He, Cheng
AU - Cai, Haibin
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/7
Y1 - 2020/7
N2 - As a supervised learning method, the Factorization Machine (FM) is well known for its ability to model feature interactions. However, FM's performance may suffer because it assigns the same weight to all feature interactions, even though not all of them are equally useful and predictive. The Attentional Factorization Machine (AFM) improves FM by discriminating the importance of different feature interactions via a neural attention network. Nevertheless, the neural attention network in AFM is not fine-grained enough, and it ignores the field information implied by the features, which limits the model's performance. In this work, we propose a novel model named the IO-aware Factorization Machine (IOFM), which enhances the feature representation ability of the attention mechanism in estimating weights via two awareness auxiliary matrices. To make the model more efficient, we further reduce the number of model parameters by applying canonical decomposition to the two auxiliary matrices, and we design a shared matrix to correlate the decomposed matrices. Extensive experiments on two real-world datasets demonstrate the superiority of our IOFM model over state-of-the-art methods.
AB - As a supervised learning method, the Factorization Machine (FM) is well known for its ability to model feature interactions. However, FM's performance may suffer because it assigns the same weight to all feature interactions, even though not all of them are equally useful and predictive. The Attentional Factorization Machine (AFM) improves FM by discriminating the importance of different feature interactions via a neural attention network. Nevertheless, the neural attention network in AFM is not fine-grained enough, and it ignores the field information implied by the features, which limits the model's performance. In this work, we propose a novel model named the IO-aware Factorization Machine (IOFM), which enhances the feature representation ability of the attention mechanism in estimating weights via two awareness auxiliary matrices. To make the model more efficient, we further reduce the number of model parameters by applying canonical decomposition to the two auxiliary matrices, and we design a shared matrix to correlate the decomposed matrices. Extensive experiments on two real-world datasets demonstrate the superiority of our IOFM model over state-of-the-art methods.
KW - Factorization Machines
KW - Neural Attention Network
KW - Recommender Systems
KW - User Response Prediction
UR - https://www.scopus.com/pages/publications/85093833208
U2 - 10.1109/IJCNN48605.2020.9207424
DO - 10.1109/IJCNN48605.2020.9207424
M3 - Conference contribution
AN - SCOPUS:85093833208
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 International Joint Conference on Neural Networks, IJCNN 2020
Y2 - 19 July 2020 through 24 July 2020
ER -