An efficient method of dynamic texture tracking based on increment evolution

  • Hongyan Quan*
  • Changbo Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Dynamic texture tracking is an active research problem in computer vision, video surveillance, and related areas. This paper presents a real-time dynamic texture tracking method. We first propose a level-set-based dynamic texture segmentation method, in which we define the evolution function, apply the re-initialization condition as a penalty during evolution, and use an optical-flow-based terminal condition to bound the evolution, making the level set more efficient. This overcomes the re-initialization requirement and the slow evolution speed of level sets in existing methods. For tracking, in order to achieve real-time performance, we exploit the relevance of adjacent frames in the evolution. Experimental results show that the method is very effective for real-time tracking of dynamic textures.
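The abstract describes penalizing the evolution with a re-initialization condition so the level-set function stays well behaved without explicit re-initialization. The sketch below illustrates that general idea with a distance-regularization-style penalty term; the function name, parameters, and the specific penalty form are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def evolve_level_set(phi, speed, mu=0.2, dt=0.2, n_iter=100):
    """Illustrative penalized level-set evolution (not the paper's exact scheme).

    phi   : 2-D level-set function (signed-distance-like)
    speed : external speed field driving the front (e.g. a region/edge term)
    mu    : weight of the re-initialization penalty, which drives
            |grad(phi)| toward 1 so explicit re-initialization is avoided
    """
    for _ in range(n_iter):
        gy, gx = np.gradient(phi)
        grad_norm = np.sqrt(gx**2 + gy**2) + 1e-8

        # Re-initialization penalty: div((1 - 1/|grad phi|) * grad phi),
        # a diffusion-like term that pulls the gradient magnitude toward 1.
        px = (1.0 - 1.0 / grad_norm) * gx
        py = (1.0 - 1.0 / grad_norm) * gy
        penalty = np.gradient(px, axis=1) + np.gradient(py, axis=0)

        # External force moves the zero level set; penalty keeps phi regular.
        phi = phi + dt * (speed * grad_norm + mu * penalty)
    return phi
```

For the incremental, frame-to-frame tracking the abstract mentions, one would initialize the evolution for each new frame from the converged level set of the previous frame, so that only a few iterations are needed per frame.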

Original language: English
Title of host publication: Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
DOIs
State: Published - 2009
Event: 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009 - Wuhan, China
Duration: 11 Dec 2009 → 13 Dec 2009

Publication series

Name: Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009

Conference

Conference: 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
Country/Territory: China
City: Wuhan
Period: 11/12/09 → 13/12/09

Keywords

  • Dynamic texture
  • Level set
  • Optical flow
  • Region segmentation
  • Tracking

