Multilevel Edge Features Guided Network for Image Denoising

Faming Fang, Juncheng Li, Yiting Yuan, Tieyong Zeng, Guixu Zhang

Research output: Contribution to journal › Article › peer-review

74 Scopus citations

Abstract

Image denoising is a challenging inverse problem due to complex scenes and information loss. Recently, various methods have been proposed to solve this problem by building a well-designed convolutional neural network (CNN) or by introducing hand-designed image priors. Different from previous works, we investigate a new framework for image denoising, which integrates edge detection, edge guidance, and image denoising into an end-to-end CNN model. To achieve this goal, we propose a multilevel edge features guided network (MLEFGN). First, we build an edge reconstruction network (Edge-Net) to directly predict clear edges from the noisy image. Then, the Edge-Net is embedded as part of the model to provide edge priors, and a dual-path network is applied to extract image and edge features, respectively. Finally, we introduce a multilevel edge features guidance mechanism for image denoising. To the best of our knowledge, the Edge-Net is the first CNN model specially designed to reconstruct image edges from a noisy image, and it shows good accuracy and robustness on natural images. Extensive experiments show that our MLEFGN achieves favorable performance compared with other methods, and extensive ablation studies demonstrate the effectiveness of the proposed Edge-Net and MLEFGN. The code is available at https://github.com/MIVRC/MLEFGN-PyTorch.
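The pipeline described in the abstract — an Edge-Net that predicts a clean edge map from the noisy input, a dual-path network extracting image and edge features, and multilevel guidance fusing the two paths — can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch: the layer counts, channel widths, and concatenation-based fusion are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class EdgeNet(nn.Module):
    """Predicts a clean edge map directly from the noisy image."""
    def __init__(self, ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1))

    def forward(self, x):
        return self.body(x)

class DualPathDenoiser(nn.Module):
    """Image path and edge path; edge features guide the image path
    at multiple levels (here: concatenation + 1x1 fusion, assumed)."""
    def __init__(self, ch=32, levels=3):
        super().__init__()
        self.edge_net = EdgeNet(ch)
        self.img_head = nn.Conv2d(1, ch, 3, padding=1)
        self.edge_head = nn.Conv2d(1, ch, 3, padding=1)
        self.img_blocks = nn.ModuleList(
            nn.Conv2d(ch, ch, 3, padding=1) for _ in range(levels))
        self.edge_blocks = nn.ModuleList(
            nn.Conv2d(ch, ch, 3, padding=1) for _ in range(levels))
        self.fuse = nn.ModuleList(
            nn.Conv2d(2 * ch, ch, 1) for _ in range(levels))
        self.tail = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, noisy):
        edge = self.edge_net(noisy)              # predicted edge prior
        f_img = torch.relu(self.img_head(noisy))
        f_edge = torch.relu(self.edge_head(edge))
        for ib, eb, fu in zip(self.img_blocks, self.edge_blocks, self.fuse):
            f_edge = torch.relu(eb(f_edge))
            f_img = torch.relu(ib(f_img))
            # multilevel guidance: inject edge features at every level
            f_img = torch.relu(fu(torch.cat([f_img, f_edge], dim=1)))
        return noisy - self.tail(f_img)          # residual noise prediction

x = torch.randn(1, 1, 32, 32)                    # a grayscale noisy patch
out = DualPathDenoiser()(x)
print(out.shape)  # torch.Size([1, 1, 32, 32])
```

The design choice worth noting is that the edge path is fused into the image path at every level rather than once at the end, which is what "multilevel edge features guidance" refers to in the abstract.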

Original language: English
Article number: 9178433
Pages (from-to): 3956-3970
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 32
Issue number: 9
State: Published - Sep 2021

Keywords

  • Edge guidance
  • edge reconstruction network (Edge-Net)
  • image denoising
