Farewell to CycleGAN: Single GAN with decoupled constraint for unpaired image dehazing

Xiaotong Luo, Wenjin Yang, Yuan Xie, Yanyun Qu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Unpaired image dehazing has attracted increasing attention, since the pair-wise training data required by supervised dehazing methods either incurs high cost when captured in the real world or causes performance degradation on real hazy scenes when synthesized. Existing methods for unpaired image dehazing are all based on a CycleGAN-like framework with a pixel-to-pixel constraint, which leads to burdensome model complexity and unstable training. In this paper, we propose a novel single-GAN model for unpaired image dehazing (SinGAN-Dehaze), which dispenses with the cycle-consistency constraint. Specifically, cycle-consistency is decoupled into content-consistency and style-consistency, and the pixel-to-pixel mapping is replaced by a patch-to-patch semantic mapping. Content-consistency is ensured by capturing local distinctive representations and global contextual dependencies. Style-consistency is achieved by forcing the high-frequency information distribution of the dehazing result to be close to that of a clear image with a similar style. Extensive experiments demonstrate that our proposal achieves superior performance for unpaired image dehazing in terms of both objective metrics and visual quality on synthetic as well as real hazy scenarios.
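To make the style-consistency idea in the abstract concrete, the sketch below illustrates one plausible reading: extract high-frequency residuals from the dehazed output and a clear reference image, then compare their distributions. The box-blur filter, the 32-bin histogram, and the L1 distance are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def high_freq(img, ksize=3):
    """High-pass residual: image minus a box-blurred copy.
    A simple stand-in for high-frequency extraction; the paper's
    exact filter is not specified here."""
    pad = ksize // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    blur = np.zeros_like(img, dtype=float)
    # Box blur via a sliding-window sum (low-frequency component).
    for dy in range(ksize):
        for dx in range(ksize):
            blur += padded[dy:dy + h, dx:dx + w]
    blur /= ksize * ksize
    return img - blur  # high-frequency residual

def style_loss(dehazed, clear):
    """Hypothetical style-consistency loss: L1 distance between the
    high-frequency histograms of the dehazed output and a clear image."""
    hd, _ = np.histogram(high_freq(dehazed), bins=32, range=(-1, 1), density=True)
    hc, _ = np.histogram(high_freq(clear), bins=32, range=(-1, 1), density=True)
    return float(np.abs(hd - hc).mean())
```

In a training loop, such a loss would be minimized alongside a content-consistency term; identical high-frequency distributions drive the loss to zero.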

Original language: English
Article number: 129888
Journal: Neurocomputing
Volume: 636
DOIs
State: Published - 1 Jul 2025

Keywords

  • Contrastive learning
  • Disentangled representation learning
  • Style transfer
  • Unpaired image dehazing

