TY - JOUR
T1 - Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge
AU - Hu, Hanglei
AU - Guo, Yingying
AU - Chen, Zhikang
AU - Cui, Sen
AU - Wu, Fei
AU - Kuang, Kun
AU - Zhang, Min
AU - Jiang, Bo
N1 - Publisher Copyright:
© 2025 by the authors.
PY - 2025
Y1 - 2025
N2 - Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces text-modality collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model’s generalization ability. Code is available at https://github.com/llm4edu/NCAL_ICML2025.git.
AB - Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces text-modality collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model’s generalization ability. Code is available at https://github.com/llm4edu/NCAL_ICML2025.git.
UR - https://www.scopus.com/pages/publications/105023577630
M3 - Conference article
AN - SCOPUS:105023577630
SN - 2640-3498
VL - 267
SP - 24314
EP - 24327
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 42nd International Conference on Machine Learning, ICML 2025
Y2 - 13 July 2025 through 19 July 2025
ER -