High-Order Contrastive Learning with Fine-grained Comparative Levels for Sparse Ordinal Tensor Completion

Yu Dai, Junchen Shen, Zijie Zhai, Danlin Liu, Jingyang Chen, Yu Sun, Ping Li, Jie Zhang, Kai Zhang

Research output: Contribution to journal › Conference article › peer-review

Abstract

Contrastive learning is a powerful paradigm for representation learning with wide applications in vision and NLP, but extending its success to high-dimensional tensors remains a challenge. Tensor data often exhibit high-order mode interactions that are hard to profile, the number of negative samples grows combinatorially, and many real-world tensors have ordinal entries that call for more delicate comparative levels. We propose High-Order Contrastive Tensor Completion (HOCTC), which extends contrastive learning to sparse ordinal tensor regression. HOCTC employs a novel attention-based strategy with query expansion to capture high-order mode interactions even with very limited tokens, going beyond second-order learning scenarios. In addition, it extends two-level (positive-vs-negative) comparison to fine-grained contrast levels, using ordinal tensor entries as natural guidance. An efficient sampling scheme enforces these delicate comparative structures, generating comprehensive self-supervised signals for high-order representation learning. Experiments show that HOCTC achieves promising results in sparse tensor completion for traffic and recommender applications.
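To make the fine-grained comparative levels concrete, the following is a minimal sketch (not the authors' implementation) of one plausible way to replace binary positive-vs-negative labels with graded targets derived from ordinal entries: candidates whose ratings are ordinally closer to the anchor's rating receive larger weights in an InfoNCE-style loss. The function names `ordinal_contrast_weights` and `graded_infonce` and the 5-level rating scale are illustrative assumptions.

```python
import numpy as np

def ordinal_contrast_weights(anchor_rating, candidate_ratings, num_levels=5):
    # Ordinal distance between the anchor entry's rating and each candidate's:
    # smaller distance => "more positive", larger => "more negative".
    dist = np.abs(candidate_ratings - anchor_rating)
    # Map distances to graded similarity targets in [0, 1], replacing the
    # usual binary positive/negative labels of contrastive learning.
    return 1.0 - dist / (num_levels - 1)

def graded_infonce(anchor, candidates, weights, temperature=0.1):
    # Cosine similarities between the anchor embedding and candidate embeddings.
    a = anchor / np.linalg.norm(anchor)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    logits = c @ a / temperature
    log_probs = logits - np.log(np.sum(np.exp(logits)))
    # Weight each candidate's log-likelihood by its ordinal similarity target.
    return -np.sum(weights * log_probs) / np.sum(weights)

# Example: 5-level ratings; anchor rated 4, candidates rated 4, 3, and 1.
rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
candidates = rng.normal(size=(3, 8))
w = ordinal_contrast_weights(4, np.array([4, 3, 1]))
loss = graded_infonce(anchor, candidates, w)
```

With a 5-level scale, an anchor rated 4 yields weights 1.0, 0.75, and 0.25 for candidates rated 4, 3, and 1 respectively, so the loss pulls the anchor toward same-rating entries while only mildly repelling nearby ratings.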

Original language: English
Pages (from-to): 9856-9871
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024
