Example-based image upscaling using parallel texture synthesis

Rugang Zheng, Bin Sheng, Lizhuang Ma

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citations

Abstract

This paper presents an example-based super-resolution image upscaling method that extends the current framework. In our approach, we rely on several external example images to represent different scales of the original image. We perform parallel texture synthesis to reconstruct the information lost in the target image during upsampling. We then apply a correction step to make the result better match the original low-resolution image. Moreover, since self-looping over the example image is allowed, our method can produce arbitrarily magnified images, which is not practical with current upscaling schemes.
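The abstract's correction step, which pulls the upscaled result back toward consistency with the low-resolution input, is commonly realized as iterative back-projection. The sketch below illustrates that idea only; it uses nearest-neighbor upsampling and block-average downsampling as stand-ins for the paper's texture-synthesis reconstruction, and all function names are hypothetical.

```python
import numpy as np

def upscale_with_correction(low_res, scale=2, n_iter=5):
    """Sketch: upsample, then iteratively back-project the residual so
    that downsampling the result reproduces the low-res input."""
    # Nearest-neighbor upsample as a crude high-res estimate.
    high = np.kron(low_res, np.ones((scale, scale)))
    h, w = low_res.shape
    for _ in range(n_iter):
        # Downsample the current estimate by block averaging.
        down = high.reshape(h, scale, w, scale).mean(axis=(1, 3))
        # Distribute the residual back onto the high-res estimate.
        high += np.kron(low_res - down, np.ones((scale, scale)))
    return high
```

After the loop, block-averaging the output reproduces the low-resolution input, which is the consistency constraint the paper's correction enforces; the texture-synthesis stage would supply the missing high-frequency detail that this sketch omits.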

Original language: English
Title of host publication: ICALIP 2012 - 2012 International Conference on Audio, Language and Image Processing, Proceedings
Pages: 710-715
Number of pages: 6
DOIs
State: Published - 2012
Externally published: Yes
Event: 2012 3rd IEEE/IET International Conference on Audio, Language and Image Processing, ICALIP 2012 - Shanghai, China
Duration: 16 Jul 2012 - 18 Jul 2012

Publication series

Name: ICALIP 2012 - 2012 International Conference on Audio, Language and Image Processing, Proceedings

Conference

Conference: 2012 3rd IEEE/IET International Conference on Audio, Language and Image Processing, ICALIP 2012
Country/Territory: China
City: Shanghai
Period: 16/07/12 - 18/07/12
