Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge

  • Hanglei Hu
  • Yingying Guo
  • Zhikang Chen
  • Sen Cui
  • Fei Wu
  • Kun Kuang
  • Min Zhang*
  • Bo Jiang*
  *Corresponding author for this work

Research output: Contribution to journal · Conference article · peer-review

Abstract

Personalized learning, especially data-based methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, significantly impacting model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces text-modality collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing works, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model’s generalization ability. Code is available at https://github.com/llm4edu/NCAL_ICML2025.git.
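The simplex equiangular tight frame (ETF) geometry the abstract refers to can be illustrated concretely: for K classes embedded in a d-dimensional space, the K target directions are unit vectors whose pairwise cosine similarity is exactly -1/(K-1), i.e. maximally and equally separated. The following is a minimal NumPy sketch of the standard ETF construction from the neural-collapse literature, not the authors' implementation; the function name and the requirement d >= K are assumptions of this sketch.

```python
import numpy as np

def simplex_etf(num_classes: int, dim: int, seed: int = 0) -> np.ndarray:
    """Return a (dim x K) matrix whose columns form a K-class simplex ETF.

    Columns are unit-norm, and every pair of distinct columns has
    cosine similarity -1/(K-1). Requires dim >= num_classes for the
    QR-based orthonormal basis used here.
    """
    K = num_classes
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U (dim x K) via QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((dim, K)))
    # Center the columns and rescale: M = sqrt(K/(K-1)) * U (I - (1/K) 11^T).
    # Centering subtracts the column mean, which is exactly U @ (11^T / K).
    M = np.sqrt(K / (K - 1)) * (U - U.mean(axis=1, keepdims=True))
    return M

# Example: 5 classes in a 16-dimensional feature space.
M = simplex_etf(num_classes=5, dim=16)
G = M.T @ M  # Gram matrix: 1 on the diagonal, -1/4 off-diagonal.
```

A regularizer in the spirit of the paper's TC term would then pull class-mean text embeddings toward these fixed ETF directions, so that head and tail classes occupy equally separated regions of the representation space regardless of their sample counts.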

Original language: English
Pages (from-to): 24314-24327
Number of pages: 14
Journal: Proceedings of Machine Learning Research
Volume: 267
State: Published - 2025
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver, Canada
Duration: 13 Jul 2025 - 19 Jul 2025
