Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/3961
Title: A Dense Network Based Framework for the Fusion of Infrared and Visible Images with Edge Extraction
Authors: Banerjee, Ankan
Patra, Dipti
Roy, Pradipta
Keywords: Image fusion
Deep learning
Dense network
Infrared images
Visible images
Visual perception
Issue Date: Feb-2023
Citation: 3rd IEEE International Conference on Range Technology (ICORT 2023), Chandipur, India, 23-25 February 2023
Abstract: Fusion of infrared (IR) and visible images has become an essential pre-processing step for image analysis in medical, defense, remote sensing, and other domains. To date, spatial and transform domain methods have produced fused images of limited quality. Deep learning-based techniques, meanwhile, are mostly concerned with feature extraction from the source images before fusion, while the distinct features of the individual source images have been overlooked. In the proposed model, an attempt has been made to extract the edges of the IR image, followed by feature extraction from the IR and visible images using a dense network-based convolutional neural network. The feature maps from the two pipelines are concatenated with the edge-extracted IR image and the enhanced visible image to obtain the final fused image. The proposed model has been tested on the TNO image dataset. The quantitative and qualitative results show that the proposed method outperforms the state-of-the-art methods in entropy content and also produces fused images with better human perception quality.
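
The pipeline described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden illustration in PyTorch, not the authors' published implementation: it uses a Sobel operator as a stand-in for the paper's edge extractor, small dense blocks for the two feature-extraction pipelines, the raw visible image in place of the enhanced one, and a simple convolutional reconstruction head. All names (DenseBlock, sobel_edges, FusionNet) and layer sizes are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseBlock(nn.Module):
    """Small dense block: each conv layer sees all earlier feature maps (densely connected)."""
    def __init__(self, in_ch, growth=16, layers=3):
        super().__init__()
        self.convs = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.convs.append(nn.Conv2d(ch, growth, kernel_size=3, padding=1))
            ch += growth
        self.out_ch = ch

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(F.relu(conv(torch.cat(feats, dim=1))))
        return torch.cat(feats, dim=1)

def sobel_edges(img):
    """Edge map of the IR image via Sobel gradients (a stand-in for the paper's edge extractor)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(img, kx, padding=1)
    gy = F.conv2d(img, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

class FusionNet(nn.Module):
    """Two dense-network pipelines (IR and visible) whose feature maps are concatenated
    with the edge-extracted IR image and the visible image, then reduced to one fused output."""
    def __init__(self):
        super().__init__()
        self.ir_branch = DenseBlock(in_ch=1)
        self.vis_branch = DenseBlock(in_ch=1)
        fused_ch = self.ir_branch.out_ch + self.vis_branch.out_ch + 2  # + edge map + visible image
        self.reconstruct = nn.Sequential(
            nn.Conv2d(fused_ch, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, ir, vis):
        edges = sobel_edges(ir)
        feats = torch.cat([self.ir_branch(ir), self.vis_branch(vis), edges, vis], dim=1)
        return self.reconstruct(feats)

if __name__ == "__main__":
    ir = torch.rand(1, 1, 256, 256)   # single-channel infrared image
    vis = torch.rand(1, 1, 256, 256)  # single-channel visible image
    fused = FusionNet()(ir, vis)
    print(fused.shape)  # torch.Size([1, 1, 256, 256])
```

The usage in the __main__ block shows the intended input/output shapes only; the actual network depth, growth rates, and enhancement of the visible image follow the paper rather than this sketch.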
Description: Copyright belongs to proceeding publisher
URI: http://hdl.handle.net/2080/3961
Appears in Collections: Conference Papers

Files in This Item:
File: 2023_ICORT_ABanerjee_ADense.pdf
Size: 1.56 MB
Format: Adobe PDF

