Art style classification via self-supervised dual-teacher knowledge distillation

Mei Luo, Li Liu, Yue Lu, Ching Y. Suen

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Art style classification plays a crucial role in computational aesthetics. Traditional deep learning-based methods for art style classification typically require a large number of labeled images, which are scarce in the art domain. To address this challenge, we propose a self-supervised learning method specifically tailored for art style classification. Our method effectively learns image style features using unlabeled images. Specifically, we introduce a novel self-supervised learning approach based on the popular contrastive learning framework, incorporating a unique dual-teacher knowledge distillation technique. The two teacher networks provide complementary guidance to the student network. Each teacher network focuses on extracting distinct features, offering diverse perspectives. This collaborative guidance enables the student network to learn detailed and robust representations of art style attributes. Furthermore, recognizing the Gram matrix's capability to capture image style through feature correlations, we explicitly integrate it into our self-supervised learning framework. We propose a relation alignment loss to train the network, leveraging image relationships. This loss function has shown promising results compared to the commonly used InfoNCE loss. To validate our proposed method, we conducted extensive experiments on three publicly available datasets: WikiArt, Pandora18k, and Flickr. The experimental results demonstrate the superiority of our method, significantly outperforming state-of-the-art self-supervised learning methods. Additionally, when compared with supervised methods, our approach shows competitive results, notably surpassing supervised learning methods on the Flickr dataset. Ablation experiments further verify the efficacy of each component of our proposed network. The code is publicly available at: https://github.com/lm-oc/dual_signal_gram_matrix.
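The abstract notes that a Gram matrix captures image style through feature correlations. As a minimal illustration of that idea (not the paper's implementation), the sketch below computes a normalized Gram matrix from a CNN-style feature map: correlating channels discards spatial layout while retaining texture statistics, which is what makes it a style descriptor.

```python
import numpy as np

def gram_matrix(features):
    """Normalized Gram matrix of a (C, H, W) feature map.

    G[i, j] is the inner product of flattened channels i and j,
    so G encodes channel-wise feature correlations (style/texture)
    independently of where features occur spatially.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)  # flatten spatial dimensions
    g = f @ f.T                     # (C, C) channel correlation matrix
    return g / (c * h * w)          # normalize by tensor size

# Toy example: 4 channels of an 8x8 feature map
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
g = gram_matrix(feats)
print(g.shape)                 # (4, 4)
print(np.allclose(g, g.T))     # True: Gram matrices are symmetric
```

In practice the input would come from an intermediate layer of a pretrained backbone rather than random values; the function name and normalization here are illustrative assumptions.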

Original language: English
Article number: 112964
Journal: Applied Soft Computing
Volume: 174
State: Published - Apr 2025

Keywords

  • Art style classification
  • Dual-teacher knowledge distillation
  • Gram matrix
  • Relation alignment loss
  • Self-supervised contrastive learning
