Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/4009
Title: A Robust Object Recognition Using Modified YOLOv5 Neural Network
Authors: Balmik, Archana
Barik, Subhasish
Nandy, Anup
Keywords: Computer Vision
Object Recognition
Object Localization
Feature Extraction
YOLOv5
Issue Date: Mar-2023
Citation: International Conference on Signal Processing and Integrated Networks (SPIN), Amity University, Noida, Delhi-NCR, India, 23-24 March 2023
Abstract: Human beings can recognize and detect objects in an image or video with ease, but the same task is challenging for machines. In recent years there has been significant progress in computer vision, yet researchers continue to improve object recognition algorithms in terms of both accuracy and speed. This paper employs one of the most prominent CNN-based detectors, You Only Look Once (YOLO), which tackles object detection in a simple and efficient way. The study focuses on object recognition with a modified YOLOv5. The model is implemented on a dataset comprising five objects, and various preprocessing and feature extraction techniques are analyzed to enhance its performance. The highest mean average precision (mAP) achieved by the modified YOLOv5 is 90.7%. Its recognition performance is compared with the YOLOv4 algorithm, as well as with conventional object detection approaches outside the YOLO family, such as the support vector machine (SVM), a deep convolutional neural network (CNN), and EfficientNet. The results are promising compared with these state-of-the-art techniques.
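The abstract reports results as mean average precision (mAP), the standard detection metric. As a minimal sketch of how that number is typically obtained (Pascal-VOC-style all-point interpolation; the function names and the toy inputs below are illustrative, not from the paper):

```python
def average_precision(tp_flags, num_gt):
    """AP for one class, given detections already sorted by descending
    confidence. tp_flags: 1 = true positive, 0 = false positive.
    num_gt: number of ground-truth boxes for that class."""
    tp = fp = 0
    precisions, recalls = [], []
    for flag in tp_flags:
        tp += flag
        fp += 1 - flag
        precisions.append(tp / (tp + fp))
        recalls.append(tp / num_gt)
    # Precision envelope: make precision monotonically non-increasing
    # from right to left before integrating.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Area under the stepwise precision-recall curve.
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_recall)
        prev_recall = r
    return ap

def mean_average_precision(per_class):
    """mAP = mean of per-class APs; per_class is a list of
    (tp_flags, num_gt) pairs, one per object class."""
    return sum(average_precision(f, n) for f, n in per_class) / len(per_class)
```

For example, `average_precision([1, 0, 1], 2)` evaluates a class with two ground-truth boxes where the first and third detections match; the paper's reported 90.7% mAP would be the mean of such per-class APs over its five objects.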
Description: Copyright belongs to proceeding publisher
URI: http://hdl.handle.net/2080/4009
Appears in Collections:Conference Papers

Files in This Item:
File: 2023_SPIN_ABalmik_ARobust.pdf (3.37 MB, Adobe PDF)
