Moving Object Detection Using Tensor-Based Low-Rank and Saliently Fused-Sparse Decomposition

Wenrui Hu, Yehui Yang, Wensheng Zhang, Yuan Xie

Research output: Contribution to journal › Article › peer-review

67 Scopus citations

Abstract

In this paper, we propose a new low-rank and sparse representation model for moving object detection. The model preserves the natural space-time structure of video sequences by representing them as three-way tensors, and then performs the low-rank background and sparse foreground decomposition in the tensor framework. On the one hand, we use the tensor nuclear norm, based on the circulant algebra, to exploit the spatio-temporal redundancy of the background. On the other hand, we use the newly designed saliently fused-sparse regularizer (SFS) to adaptively constrain the foreground with spatio-temporal smoothness. To refine existing foreground smoothness regularizers, the SFS incorporates local spatio-temporal geometric structure information into the tensor total variation by using the 3D locally adaptive regression kernel (3D-LARK). Moreover, the SFS uses the 3D-LARK to compute the space-time motion saliency of the foreground, which is combined with the L1 norm and improves the robustness of foreground extraction. Finally, we solve the proposed model with a globally optimal guarantee. Extensive experiments on challenging well-known data sets demonstrate that our method significantly outperforms state-of-the-art approaches and works effectively on a wide range of complex scenarios.
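To make the core decomposition idea concrete, the sketch below shows a simplified *matrix* low-rank plus sparse (RPCA-style) separation of a video into a static background and a sparse moving foreground, solved with alternating singular-value and soft thresholding. This is only an illustrative stand-in: the paper's actual model uses the tensor nuclear norm under the circulant algebra and the SFS regularizer, neither of which is implemented here, and all function names and parameters are our own assumptions.

```python
import numpy as np

def rpca_background_foreground(video, lam=None, mu=0.1, rho=1.2, n_iter=60):
    """Illustrative matrix RPCA sketch (inexact ALM style), NOT the
    paper's tensor-based method. Frames of shape (T, H, W) are
    unfolded into a (pixels, frames) matrix D and split as
    D ~ L (low-rank background) + S (sparse foreground)."""
    T, H, W = video.shape
    D = video.reshape(T, -1).T.astype(float)      # (H*W, T)
    if lam is None:
        lam = 1.0 / np.sqrt(max(D.shape))         # standard RPCA weight
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                          # scaled dual variable
    for _ in range(n_iter):
        # Singular value thresholding -> low-rank background estimate
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Elementwise soft thresholding -> sparse foreground estimate
        R = D - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual update and penalty growth
        Y += mu * (D - L - S)
        mu *= rho
    background = L.T.reshape(T, H, W)
    foreground = S.T.reshape(T, H, W)
    return background, foreground
```

Replacing the matrix nuclear norm here with the tensor nuclear norm, and the plain L1 term with the saliency-weighted SFS regularizer, is precisely the refinement the paper proposes: the tensor form keeps the space-time structure that matrix unfolding destroys.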

Original language: English
Article number: 7740902
Pages (from-to): 724-737
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 26
Issue number: 2
DOIs
State: Published - Feb 2017
Externally published: Yes

Keywords

  • Moving object detection
  • Space-time visual saliency
  • Tensor nuclear norm
  • Tensor total variation

