RANSP: Ranking attention network for saliency prediction on omnidirectional images

Dandan Zhu, Yongqing Chen, Xiongkuo Min, Yucheng Zhu, Guokai Zhang, Qiangqiang Zhou, Guangtao Zhai, Xiaokang Yang

Research output: Contribution to journal › Article › peer-review


Abstract

Various convolutional neural network (CNN)-based methods have shown the ability to boost the performance of saliency prediction on omnidirectional images (ODIs). However, these methods achieve sub-optimal accuracy, because not all the features extracted by the CNN model are useful for fine-grained saliency prediction: some features are redundant and may have a negative impact on the final result. To tackle this problem, we propose a novel Ranking Attention Network for Saliency Prediction (RANSP) of head fixations on ODIs. Specifically, a part-guided attention (PA) module and a channel-wise feature (CF) extraction module are integrated in a unified framework and trained in an end-to-end manner for fine-grained saliency prediction. To better utilize the channel-wise feature maps, we further propose a new Ranking Attention Module (RAM), which automatically ranks and selects these feature maps based on scores for fine-grained saliency prediction. Extensive experiments and ablation studies demonstrate the effectiveness of our method for saliency prediction on ODIs.
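The abstract describes the Ranking Attention Module as ranking channel-wise feature maps by learned scores and selecting a subset for prediction. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of that rank-and-select idea; the function name, the per-channel score vector, and the top-k selection rule are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ranking_attention(feature_maps, scores, k):
    """Illustrative rank-and-select step (hypothetical, not the paper's RAM).

    feature_maps: (C, H, W) channel-wise feature maps
    scores:       (C,) importance score per channel (assumed to be learned)
    k:            number of top-ranked channels to keep
    """
    # Rank channels by score, highest first, and keep the top k
    order = np.argsort(scores)[::-1]
    top = order[:k]
    # Turn the selected scores into attention weights via a softmax
    w = np.exp(scores[top] - scores[top].max())
    w = w / w.sum()
    # Re-weight the surviving channel maps; redundant channels are discarded
    return feature_maps[top] * w[:, None, None]

maps = np.random.rand(8, 4, 4)      # toy feature tensor: 8 channels of 4x4 maps
scores = np.random.rand(8)          # toy per-channel scores
selected = ranking_attention(maps, scores, k=3)
print(selected.shape)               # (3, 4, 4)
```

In a real network the scores would be produced by a learned branch and the selection trained end-to-end; this sketch only shows the ranking/selection arithmetic on fixed arrays.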

Original language: English
Pages (from-to): 118-128
Number of pages: 11
Journal: Neurocomputing
Volume: 461
DOIs
State: Published - 21 Oct 2021
Externally published: Yes

Keywords

  • Channel-wise feature maps
  • Omnidirectional images
  • Part-guided attention
  • Ranking attention
  • Saliency prediction

