CPSAM: Channel and Position Squeeze Attention Module

  • Yuchen Gong
  • Zhihao Gu*
  • Zhenghao Zhang
  • Lizhuang Ma

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

In deep neural networks, how to model long-range dependencies over time or space has long been an open problem. By aggregating a query-specific global context for each query location, Non-Local (NL) networks proposed a pioneering method of capturing such remote dependencies. However, the NL network faces several problems: 1) For different query positions in the image, the long-range dependencies modeled by the NL network are quite similar, so building pixel-level pairwise relations is a waste of computation. 2) The NL network focuses only on capturing spatial long-range dependencies and neglects channel-wise attention. In response to these problems, we propose the Channel and Position Squeeze Attention Module (CPSAM). Specifically, for an intermediate feature map, our module infers attention maps along the channel and spatial dimensions in parallel. The Channel Squeeze Attention Module selectively joins the features of different positions through a query-independent attention map. Meanwhile, the Position Squeeze Attention Module uses both average and max pooling to compress the spatial dimension and integrate the correlation characteristics across all channel maps. Finally, the outputs of the two attention modules are combined through a conv layer to further enhance the feature representation. Compared to the NL network, we achieve higher accuracy with fewer parameters on CIFAR-100 and ImageNet-1k. The code will be publicly available soon.
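The two branches described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the exact layer shapes, the conv-based fusion, and the weight names (`w_key`, `w`) are assumptions. The channel branch follows the query-independent global-context idea the abstract attributes to CPSAM (one shared spatial attention map instead of per-query pairwise maps), and the position branch compresses the spatial dimension with both average and max pooling before producing per-channel weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_squeeze_attention(x, w_key):
    """Query-independent spatial attention: one attention map over all
    positions, shared by every query, aggregates a global context vector.
    x: (C, H, W) feature map; w_key: (1, C) assumed projection weights."""
    C, H, W = x.shape
    flat = x.reshape(C, H * W)                # (C, N), N = H*W positions
    attn = softmax(w_key @ flat, axis=-1)     # (1, N), shared attention map
    context = flat @ attn.T                   # (C, 1) global context vector
    return x + context.reshape(C, 1, 1)       # broadcast context to all positions

def position_squeeze_attention(x, w):
    """Compress the spatial dimension with both avg and max pooling, then
    rescale channels. x: (C, H, W); w: (C, C) assumed mixing weights."""
    C = x.shape[0]
    avg = x.mean(axis=(1, 2))                 # (C,) average-pooled descriptor
    mx = x.max(axis=(1, 2))                   # (C,) max-pooled descriptor
    scale = 1.0 / (1.0 + np.exp(-(w @ avg + w @ mx)))  # sigmoid channel weights
    return x * scale.reshape(C, 1, 1)

# The paper fuses the two branch outputs with a conv layer; as a stand-in,
# a simple sum of the two outputs illustrates the parallel structure.
```

Both branches keep the input shape, so their outputs can be fused elementwise or concatenated along channels before the combining convolution.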

Original language: English
Title of host publication: Neural Information Processing - 28th International Conference, ICONIP 2021, Proceedings
Editors: Teddy Mantoro, Minho Lee, Media Anugerah Ayu, Kok Wai Wong, Achmad Nizar Hidayanto
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 190-202
Number of pages: 13
ISBN (Print): 9783030921842
DOIs
State: Published - 2021
Event: 28th International Conference on Neural Information Processing, ICONIP 2021 - Virtual, Online
Duration: 8 Dec 2021 - 12 Dec 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13108 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 28th International Conference on Neural Information Processing, ICONIP 2021
City: Virtual, Online
Period: 8/12/21 - 12/12/21

Keywords

  • Attention mechanism
  • Image classification
  • Non-local network
