Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/1902
Full metadata record
DC Field | Value | Language
dc.contributor.author | Das, S | -
dc.contributor.author | Meher, S | -
dc.date.accessioned | 2013-04-01T12:53:42Z | -
dc.date.available | 2013-04-01T12:53:42Z | -
dc.date.issued | 2013-03 | -
dc.identifier.citation | International Conference on Advancement in Information Technology, held on 22-23 March 2013 at Poornima Institute of Technology, Jaipur. | en
dc.identifier.uri | http://hdl.handle.net/2080/1902 | -
dc.description | Copyright belongs to the proceedings publisher | en
dc.description.abstract | Human gait is an emerging biometric resource in visual surveillance systems: individuals can be recognized by the way they walk. During walking, the human body shows regular periodic variation, for example in the upper and lower limbs, knee point, thigh point, and height, which reflects the individual's unique movement pattern. From a computational perspective, however, it is difficult to extract some feature points (knee, thigh, leg, and hip) because of occlusion by clothing and carried bags. Height is one of the important gait features that is not influenced by camera performance, distance, or the subject's clothing style. This paper proposes a DLT-based method for predicting the height variation signal over the gait cycle of each subject. Height estimation is performed using calibrated camera images. The height variation signal is further analyzed using several transforms: DHT, DFT, and DCT. Euclidean distance and MSE are computed on the feature vectors to recognize individuals. | en
dc.format.extent | 622243 bytes | -
dc.format.mimetype | application/pdf | -
dc.language.iso | en | -
dc.subject | Gait Recognition | en
dc.subject | Silhouette Detection | en
dc.subject | Camera Calibration | en
dc.subject | Height Measurement | en
dc.title | Automatic Extraction of Human Gait Feature for Recognition | en
dc.type | Article | en
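
As a rough, illustrative sketch of the recognition step summarised in the abstract above (transform-domain features of the height variation signal compared by Euclidean distance and MSE), the Python snippet below uses a DCT feature vector. The function names, normalisation, and number of coefficients are assumptions made for illustration and are not taken from the paper.

import numpy as np
from scipy.fft import dct

def height_feature(height_signal, n_coeffs=16):
    # Normalise one gait cycle's height variation signal and keep its first DCT coefficients.
    # (The paper also considers DHT and DFT; DCT is used here only as an example.)
    sig = np.asarray(height_signal, dtype=float)
    sig = (sig - sig.mean()) / (sig.std() + 1e-8)
    return dct(sig, norm='ortho')[:n_coeffs]

def recognise(probe_feat, gallery_feats):
    # Return the index of the closest gallery template (Euclidean distance)
    # together with the mean squared error to that template.
    dists = np.array([np.linalg.norm(probe_feat - g) for g in gallery_feats])
    best = int(np.argmin(dists))
    mse = float(np.mean((probe_feat - gallery_feats[best]) ** 2))
    return best, mse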
Appears in Collections: Conference Papers

Files in This Item:
File | Description | Size | Format
full paper.pdf | | 607.66 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.