TY - JOUR
T1 - Art style classification via self-supervised dual-teacher knowledge distillation
AU - Luo, Mei
AU - Liu, Li
AU - Lu, Yue
AU - Suen, Ching Y.
N1 - Publisher Copyright:
© 2025
PY - 2025/4
Y1 - 2025/4
N2 - Art style classification plays a crucial role in computational aesthetics. Traditional deep learning-based methods for art style classification typically require a large number of labeled images, which are scarce in the art domain. To address this challenge, we propose a self-supervised learning method specifically tailored for art style classification. Our method effectively learns image style features using unlabeled images. Specifically, we introduce a novel self-supervised learning approach based on the popular contrastive learning framework, incorporating a unique dual-teacher knowledge distillation technique. The two teacher networks provide complementary guidance to the student network. Each teacher network focuses on extracting distinct features, offering diverse perspectives. This collaborative guidance enables the student network to learn detailed and robust representations of art style attributes. Furthermore, recognizing the Gram matrix's capability to capture image style through feature correlations, we explicitly integrate it into our self-supervised learning framework. We propose a relation alignment loss to train the network, leveraging image relationships. This loss function has shown promising results compared to the commonly used InfoNCE loss. To validate our proposed method, we conducted extensive experiments on three publicly available datasets: WikiArt, Pandora18k, and Flickr. The experimental results demonstrate the superiority of our method, significantly outperforming state-of-the-art self-supervised learning methods. Additionally, when compared with supervised methods, our approach shows competitive results, notably surpassing supervised learning methods on the Flickr dataset. Ablation experiments further verify the efficacy of each component of our proposed network. The code is publicly available at: https://github.com/lm-oc/dual_signal_gram_matrix.
KW - Art style classification
KW - Dual-teacher knowledge distillation
KW - Gram matrix
KW - Relation alignment loss
KW - Self-supervised contrastive learning
UR - https://www.scopus.com/pages/publications/86000488916
U2 - 10.1016/j.asoc.2025.112964
DO - 10.1016/j.asoc.2025.112964
M3 - Article
AN - SCOPUS:86000488916
SN - 1568-4946
VL - 174
JO - Applied Soft Computing
JF - Applied Soft Computing
M1 - 112964
ER -