Comparison of different color spaces for image segmentation using graph-cut

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

33 Scopus citations

Abstract

Graph-cut optimization has been successfully applied to many image segmentation tasks. Within this framework, color information is extensively used as a perceptual property of objects to separate the foreground object from the background. Digital images represent color in different ways, each with its own characteristics. Previous work on segmentation lacks a systematic study of which color space is best suited for image segmentation. This work applies the graph-cut algorithm for image segmentation in five different, widespread color spaces and evaluates their performance on public benchmark datasets. Most of the tested color spaces lead to similar results. Segmentations based on the L*a*b* color space are of slightly higher or similar quality compared with all the other methods. In contrast, RGB-based segmentations are mostly worse than segmentations based on any other tested color space.
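The comparison hinges on converting pixel values between color spaces before the graph is built. As an illustration only (this is not the paper's code), a minimal pure-Python sRGB → CIE L*a*b* conversion under the standard D65 white point might look like:

```python
# Sketch of the sRGB -> CIE L*a*b* conversion (D65 reference white) that
# would precede graph-cut segmentation in the L*a*b* variant. Illustrative
# only; the paper's own implementation is not shown here.
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB components to CIE L*a*b* (D65 white point)."""
    # 1. Undo the sRGB gamma to get linear RGB in [0, 1].
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)

    # 2. Linear RGB -> CIE XYZ (standard sRGB/D65 matrix).
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl

    # 3. Normalize by the D65 white point and apply the Lab nonlinearity.
    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 \
            else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)

    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Reference white maps to L* ≈ 100 with a* ≈ b* ≈ 0, and Euclidean distances in this space approximate perceptual color differences, which is one motivation for testing L*a*b* against plain RGB.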

Original language: English
Title of host publication: VISAPP 2014 - Proceedings of the 9th International Conference on Computer Vision Theory and Applications
Publisher: SciTePress
Pages: 301-308
Number of pages: 8
ISBN (Print): 9789897580031
State: Published - 2014
Externally published: Yes
Event: 9th International Conference on Computer Vision Theory and Applications, VISAPP 2014 - Lisbon, Portugal
Duration: 5 Jan 2014 – 8 Jan 2014

Publication series

Name: VISAPP 2014 - Proceedings of the 9th International Conference on Computer Vision Theory and Applications
Volume: 1

Conference

Conference: 9th International Conference on Computer Vision Theory and Applications, VISAPP 2014
Country/Territory: Portugal
City: Lisbon
Period: 5/01/14 – 8/01/14

Keywords

  • Color space
  • Graph-cut
  • Image segmentation
