
Journal of Current Trends in Computer Science Research (JCTCSR)

ISSN: 2836-8495 | DOI: 10.33140/JCTCSR


Process of Image Super-Resolution

Sebastien Lablanche and Gerard Lablanche

Abstract

In this paper, we describe a super-resolution reconstruction process that increases the resolution of an image. The need for high-resolution digital images exists in diverse domains, for example the medical and space domains. High-resolution digital images can be obtained at the time of shooting, but this is often synonymous with significant costs because of the equipment required. To avoid such costs, it is known to use super-resolution reconstruction methods, which combine one or several low-resolution images to obtain a high-resolution image. The American patent US 9208537 describes such an algorithm: a zone of a low-resolution image is isolated and categorized according to the information contained in the pixels forming the borders of the zone, and the category of this zone determines the type of interpolation used to add pixels within it, increasing the sharpness of the image. It is also known to reconstruct a high-resolution image from a low-resolution image by using a super-resolution reconstruction model trained with neural networks on an image library. The Chinese patent application CN 107563965 and the scientific publication "Pixel Recursive Super Resolution" by R. Dahl, M. Norouzi, and J. Shlens propose such methods. The aim of this paper is to demonstrate that it is possible to reconstruct coherent human faces from very degraded pixelated images with a very fast algorithm, faster than the compressed sensing (CS) algorithm, easier to compute, and without deep learning, hence without significant information technology resources, i.e. without a large database of thousands of training images (https://arxiv.org/pdf/2003.13063.pdf). This technological breakthrough was patented in 2018 through the French patent application FR 1855485 (https://patents.google.com/patent/FR3082980A1; see the HAL reference https://hal.archives-ouvertes.fr/hal-01875898v1).
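To make the problem setting concrete, the minimal sketch below illustrates the textbook shift-and-add formulation of multi-frame super-resolution: several low-resolution frames with known subpixel shifts are accumulated onto a denser grid. This is only a generic illustration of the reconstruction principle, not the patented algorithm of FR 1855485; the function name, the assumption of known shifts, and all parameters are illustrative.

```python
import numpy as np

def shift_and_add_sr(lr_frames, shifts, scale):
    """Multi-frame super-resolution by naive shift-and-add.

    lr_frames -- list of 2-D arrays (the low-resolution frames)
    shifts    -- list of (dy, dx) subpixel shifts per frame, in
                 low-resolution pixel units (assumed known here)
    scale     -- integer upscaling factor
    """
    h, w = lr_frames[0].shape
    acc = np.zeros((h * scale, w * scale))  # accumulated intensities
    cnt = np.zeros_like(acc)                # number of samples per HR pixel

    for frame, (dy, dx) in zip(lr_frames, shifts):
        # Place each LR pixel at its (rounded) position on the HR grid.
        ys = np.rint((np.arange(h)[:, None] + dy) * scale).astype(int)
        xs = np.rint((np.arange(w)[None, :] + dx) * scale).astype(int)
        ys = np.clip(ys, 0, h * scale - 1)
        xs = np.clip(xs, 0, w * scale - 1)
        np.add.at(acc, (ys, xs), frame)  # unbuffered add handles collisions
        np.add.at(cnt, (ys, xs), 1)

    cnt[cnt == 0] = 1      # HR cells hit by no frame stay zero here;
    return acc / cnt       # a real method would interpolate them
```

With a 4x factor and sixteen frames whose shifts tile the subpixel grid, every high-resolution cell receives a sample; with fewer frames, the remaining holes are precisely what the prior-art methods cited above fill in differently, whether by zone-dependent interpolation, neural models, or compressed sensing.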
