TY - GEN
T1 - Flexible-Order Feature-Interaction for Mixed Continuous and Discrete Variables with Group-Level Interpretability
AU - Zhai, Zijie
AU - Shen, Junchen
AU - Li, Ping
AU - Zhang, Jie
AU - Zhang, Kai
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2025
Y1 - 2025
N2 - Deep neural networks have shown remarkable performance across diverse machine learning tasks. However, the balance between predictive accuracy and model interpretability remains a persistent challenge: high-performing models often exhibit complex structures defying human understanding, while interpretable (concise) models may sacrifice performance. In this paper, we show that feature interaction can be a crucial perspective when pursuing such balance, and propose flexible-order feature-interaction (FOFI), a new approach to exploit grouped feature interactions as the key to building accurate yet interpretable models. FOFI encourages local feature interactions that are organized into groups, which allows model capacity (parameters) to be distributed in a nuanced manner: at the lower granularity, dense interactions are restricted locally within each group to account for the complexity (performance); at the higher granularity, a flat predictive function is defined at group-level that guarantees the overall interpretability. Furthermore, FOFI is versatile in accommodating feature interactions of arbitrary order among mixed continuous and categorical variables. Extensive experiments on both simulated and real-world datasets showcase the encouraging performance and interpretability of FOFI.
AB - Deep neural networks have shown remarkable performance across diverse machine learning tasks. However, the balance between predictive accuracy and model interpretability remains a persistent challenge: high-performing models often exhibit complex structures defying human understanding, while interpretable (concise) models may sacrifice performance. In this paper, we show that feature interaction can be a crucial perspective when pursuing such balance, and propose flexible-order feature-interaction (FOFI), a new approach to exploit grouped feature interactions as the key to building accurate yet interpretable models. FOFI encourages local feature interactions that are organized into groups, which allows model capacity (parameters) to be distributed in a nuanced manner: at the lower granularity, dense interactions are restricted locally within each group to account for the complexity (performance); at the higher granularity, a flat predictive function is defined at group-level that guarantees the overall interpretability. Furthermore, FOFI is versatile in accommodating feature interactions of arbitrary order among mixed continuous and categorical variables. Extensive experiments on both simulated and real-world datasets showcase the encouraging performance and interpretability of FOFI.
KW - Feature Interaction
KW - Interpretability
KW - Neural Networks
UR - https://www.scopus.com/pages/publications/105008674839
U2 - 10.1007/978-981-96-6576-1_4
DO - 10.1007/978-981-96-6576-1_4
M3 - Conference contribution
AN - SCOPUS:105008674839
SN - 9789819665754
T3 - Lecture Notes in Computer Science
SP - 42
EP - 57
BT - Neural Information Processing - 31st International Conference, ICONIP 2024, Proceedings
A2 - Mahmud, Mufti
A2 - Doborjeh, Maryam
A2 - Wong, Kevin
A2 - Leung, Andrew Chi Sing
A2 - Doborjeh, Zohreh
A2 - Tanveer, M.
PB - Springer Science and Business Media Deutschland GmbH
T2 - 31st International Conference on Neural Information Processing, ICONIP 2024
Y2 - 2 December 2024 through 6 December 2024
ER -