Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/3624
Title: Iterative Dense Network using Laplacian Pyramid Model for Blind Super Resolution
Authors: Deepak, Shashikant
Singh, Chanda
Patra, Dipti
Keywords: Blind super resolution
Iterative network
Deblurring
Kernel estimation
Issue Date: Dec-2021
Citation: INDICON 2021, 19-21 December 2021, Guwahati, India (held virtually).
Abstract: Deep learning-based approaches to the ill-posed problem of single image super-resolution reconstruction (SRR) have achieved tremendous success in recent times. However, comparatively little work addresses blind image super-resolution, where the degradation kernel is unknown. This paper tackles blind single image super-resolution reconstruction using an alternating learning approach that trains two convolutional neural networks. Most available models for blind super-resolution assume a fixed degradation kernel for reconstruction, which leads to a drop in performance. Therefore, a learnable kernel estimation approach is adopted by using a kernel-estimator network. The estimated kernel is then used by a generator network to produce the super-resolution image. To faithfully reconstruct vital features such as edges and texture, and to learn the inter-pixel dependencies between multi-level feature maps, we employ a densely residual Laplacian attention block (DLA-Block). The proposed method is extensively tested on real and synthetic image datasets. The experimental results show that it outperforms the state-of-the-art in terms of reconstruction accuracy as well as PSNR and SSIM.
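To make the alternating two-network scheme described in the abstract concrete, the following is a minimal Python sketch (assuming PyTorch). The class names, layer sizes, kernel dimensionality, and L1 losses are illustrative assumptions, not the authors' implementation; in particular, the generator body here is a plain convolutional placeholder standing in for the DLA-Blocks and Laplacian pyramid structure used in the paper.

# Illustrative sketch of alternating training of a kernel-estimator network and a
# generator network for blind SRR. All names, sizes, and losses are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KernelEstimator(nn.Module):
    """Predicts a blur-kernel representation from the low-resolution input (assumed design)."""
    def __init__(self, kernel_size=21):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, kernel_size * kernel_size)

    def forward(self, lr):
        k = self.head(self.features(lr).flatten(1))
        return F.softmax(k, dim=1)  # kernel weights sum to 1

class Generator(nn.Module):
    """Upscales the LR image conditioned on the estimated kernel (placeholder for DLA-Blocks)."""
    def __init__(self, scale=4, kernel_size=21):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3 + kernel_size * kernel_size, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, lr, kernel):
        # Broadcast the kernel vector over the spatial grid and concatenate with the image.
        k_map = kernel[:, :, None, None].expand(-1, -1, lr.size(2), lr.size(3))
        return self.body(torch.cat([lr, k_map], dim=1))

def train_step(lr, hr, true_kernel, k_net, g_net, opt_k, opt_g):
    """One alternating update: kernel estimator first, then the generator."""
    # 1) Update the kernel estimator against the (synthetic) flattened ground-truth kernel.
    opt_k.zero_grad()
    loss_k = F.l1_loss(k_net(lr), true_kernel)
    loss_k.backward()
    opt_k.step()

    # 2) Update the generator using the (detached) estimated kernel.
    opt_g.zero_grad()
    sr = g_net(lr, k_net(lr).detach())
    loss_g = F.l1_loss(sr, hr)
    loss_g.backward()
    opt_g.step()
    return loss_k.item(), loss_g.item()

In the paper, the generator is built from DLA-Blocks within a Laplacian pyramid model rather than the plain convolutions shown above; the sketch is only meant to show how the kernel-estimator and generator updates can be interleaved.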
Description: Copyright of this paper is with the proceedings publisher.
URI: http://hdl.handle.net/2080/3624
Appears in Collections: Conference Papers

Files in This Item:
File: PATRA,D_IEEE INDICON2021.pdf
Size: 4.42 MB
Format: Adobe PDF

