Title: Occlusion prediction algorithms for multi-camera network
Authors: Raman, R
Sa, Pankaj K
Majhi, B
Keywords: Occlusion prediction algorithm
Multi-camera network
Perspective view analysis
Issue Date: Nov-2012
Citation: ICDSC 2012: 6th ACM/IEEE International Conference on Distributed Smart Cameras Oct 30 - Nov 2, 2012, Hong Kong
Abstract: The mode of object tracking has evolved from single-camera tracking to multi-camera tracking over the last few years. Although the multi-camera model overcomes limitations of single-camera systems, it introduces the complexity of handling a network of multiple cameras. In this paper, we propose a novel real-time occlusion prediction method for multi-camera networks, reducing the complexity and cost of tracking in the multi-camera model without losing track of the subject. The proposed method consists of sequential phases: (a) estimation of the direction of relative motion in the real plane based on changes in the bounding pixel positions of the tracked subject, (b) construction of algorithms for occlusion prediction, and (c) mitigation of occlusion by awakening a minimal subset of cameras in the network. The method uses the pattern of change in the dimensions of the bounding box with respect to frame number, obtained by applying background subtraction to the video of the mutual motion of the subjects. On the basis of the estimated direction of motion, the proposed algorithms decide the possibility and proximity of occlusion, and the minimal set of cameras in the network that does not encounter occlusion is then awakened. The proposed approach has been verified on various video samples, and it is observed that the method successfully predicts the occurrence of occlusion.
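The occlusion-prediction idea in the abstract (tracking how subjects' bounding boxes change frame by frame and predicting when they will overlap) can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the box format, the linear extrapolation of the gap, the `horizon` parameter, and the function names are all assumptions introduced here.

```python
# Hypothetical sketch of occlusion prediction from bounding-box tracks.
# Boxes are (x_min, y_min, x_max, y_max) tuples from background subtraction;
# all names and thresholds below are illustrative assumptions.

def horizontal_gap(box_a, box_b):
    """Horizontal gap between two axis-aligned boxes (negative = overlapping)."""
    left, right = (box_a, box_b) if box_a[0] <= box_b[0] else (box_b, box_a)
    return right[0] - left[2]

def predict_occlusion(track_a, track_b, horizon=5):
    """Predict whether two tracks will occlude within `horizon` frames
    by linearly extrapolating the per-frame change in their gap."""
    gaps = [horizontal_gap(a, b) for a, b in zip(track_a, track_b)]
    if len(gaps) < 2:
        return False  # not enough history to estimate relative motion
    rate = (gaps[-1] - gaps[0]) / (len(gaps) - 1)  # average gap change per frame
    if rate >= 0:
        return False  # subjects are separating or static
    frames_to_contact = gaps[-1] / -rate
    return frames_to_contact <= horizon

# Two subjects approaching each other along the x-axis.
a = [(0, 0, 10, 20), (4, 0, 14, 20), (8, 0, 18, 20)]
b = [(40, 0, 50, 20), (36, 0, 46, 20), (32, 0, 42, 20)]
print(predict_occlusion(a, b))  # True: gap closes 8 px/frame with 14 px left
```

In this toy setup, a positive prediction would be the trigger for awakening a camera whose viewpoint does not see the two subjects aligned.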
Description: Copyright for this paper belongs to IEEE
Appears in Collections:Conference Papers

Files in This Item:
RamanSaMajhi.pdf (284.61 kB, Adobe PDF)
