Please use this identifier to cite or link to this item:
http://hdl.handle.net/2080/3462
Title: Deep Learning based Loitering Detection System using Multi-camera Video Surveillance Network
Authors: Nayak, Rashmiranjan; Behera, Mohini Mohan; Girish, V; Pati, Umesh Chandra; Das, Santos Kumar
Keywords: Deep learning; Deep-SORT algorithm; loitering detection system; MobileNets; Triplet loss; Smart city; YOLOv3
Issue Date: Dec-2019
Citation: IEEE International Symposium on Smart Electronic Systems (iSES), NIT Rourkela, Odisha, India, 16-18 December 2019
Abstract: A deep-learning-based Loitering Detection System (LDS) with re-identification (ReID) capability over a multi-camera network is proposed. The proposed LDS mainly comprises object detection and tracking, loitering detection, feature extraction, camera switching, and re-identification of the loiterer. A person is detected using You Only Look Once (YOLOv3) and tracked using Simple Online and Realtime Tracking with a deep association metric (DeepSORT). From the trajectory analysis, once the time and displacement thresholds are satisfied, the person is treated as a loiterer. When the loiterer moves from one camera to another, the algorithm switches to the appropriate camera feed as per the proposed camera switching algorithm to minimize the computational cost. Subsequently, the loiterer is re-identified in the switched camera feed by comparing the features of the loiterer, extracted by MobileNets, with those of the other detected persons based on the triplet loss criterion. The proposed system achieves an accuracy of 96% at an average of 33 fps (without ReID) and 81.5% at an average of 30 fps (with ReID).
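The loitering criterion summarized in the abstract (a tracked person is flagged once a dwell-time threshold is exceeded while net displacement stays small) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation; the function name, trajectory format, and threshold values are all assumptions.

```python
import math

def is_loitering(trajectory, time_threshold=10.0, displacement_threshold=50.0):
    """Flag a tracked person as a loiterer.

    `trajectory` is a list of (timestamp_seconds, x, y) points produced by a
    tracker for one person. The person is treated as loitering when their
    dwell time meets `time_threshold` while the net displacement between the
    first and last points stays within `displacement_threshold` (pixels).
    All names and threshold values here are illustrative assumptions.
    """
    if len(trajectory) < 2:
        return False
    t_start, x_start, y_start = trajectory[0]
    t_end, x_end, y_end = trajectory[-1]
    dwell_time = t_end - t_start
    displacement = math.hypot(x_end - x_start, y_end - y_start)
    return dwell_time >= time_threshold and displacement <= displacement_threshold
```

A person who stands near one spot for 12 seconds would be flagged, while one who crosses the frame in the same time would not.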
Description: Copyright belongs to the proceedings publisher
URI: http://hdl.handle.net/2080/3462
Appears in Collections: Conference Papers
Files in This Item:

| File | Description | Size | Format |
|---|---|---|---|
| 2019_IEE-iSES_RNaik_Deep.pdf | | 11.55 MB | Adobe PDF |