Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/4706
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Ghosh, Mainak
dc.contributor.author: Nandy, Anup
dc.contributor.author: Patra, Bidyut Kr.
dc.contributor.author: Anitha, R.
dc.contributor.author: Mohanavelu, K.
dc.date.accessioned: 2024-10-03T11:41:48Z
dc.date.available: 2024-10-03T11:41:48Z
dc.date.issued: 2024-09
dc.identifier.citation: 2024 IEEE Region 10 Symposium (TENSYMP), New Delhi, India, 27-29 September 2024 [en_US]
dc.identifier.uri: http://hdl.handle.net/2080/4706
dc.description: Copyright belongs to the proceedings publisher [en_US]
dc.description.abstract: Clinical gait analysis plays a vital role in diagnosing and monitoring neurological and musculoskeletal injuries. Qualitative gait assessment depends on subjective observations, manual measurements, and specialized equipment. Recently, machine learning and deep learning based models have demonstrated significant accuracy in gait analysis, but dynamic feature extraction remains a challenging problem in temporal gait data analysis. After extracting dynamic features, a Fully-connected Neural Network (FNN) is employed to classify gait abnormalities using the GaitRec standard dataset. The proposed multi-modal feature based classification model achieves 96.22% accuracy and outperforms state-of-the-art methods. [en_US]
dc.subject: Gait classification [en_US]
dc.subject: Feature extraction [en_US]
dc.subject: Convolutional Neural Network [en_US]
dc.subject: Discrete Wavelet Transform [en_US]
dc.subject: Ground Reaction Force [en_US]
dc.title: Multi-modal Deep Neural Features for Classification of Gait Abnormality [en_US]
dc.type: Article [en_US]
Appears in Collections:Conference Papers

Files in This Item:
File: 2024_TENSYMP_MGhosh_Multi-modal.pdf
Size: 850.41 kB
Format: Adobe PDF
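The abstract describes a pipeline of dynamic feature extraction from temporal gait data (the subject keywords name the Discrete Wavelet Transform and Ground Reaction Force signals) followed by a fully-connected neural network classifier. The sketch below is not the authors' code: it is a minimal illustration of that kind of pipeline, assuming a one-level Haar wavelet decomposition, an untrained two-layer FNN, and an arbitrary count of five gait classes.

```python
# Hedged sketch (not the paper's implementation): Haar-DWT features
# from a ground-reaction-force (GRF) signal, fed to a small
# fully-connected network. Layer sizes, the Haar wavelet, and the
# number of gait classes (5) are illustrative assumptions.
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient arrays."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                           # pad to even length
        s = np.append(s, s[-1])
    a = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # low-pass (approximation)
    d = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # high-pass (detail)
    return a, d

def fnn_forward(x, weights, biases):
    """Forward pass of a fully-connected network: ReLU hidden
    layers, softmax over gait classes at the output."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, h @ W + b)       # ReLU activation
    logits = h @ weights[-1] + biases[-1]
    e = np.exp(logits - logits.max())        # stable softmax
    return e / e.sum()

# Toy usage: synthetic GRF-like curve -> DWT features -> untrained FNN.
rng = np.random.default_rng(0)
grf = np.sin(np.linspace(0.0, np.pi, 64)) + 0.05 * rng.standard_normal(64)
approx, detail = haar_dwt(grf)
features = np.concatenate([approx, detail])  # 64-dim feature vector
sizes = [64, 32, 5]                          # hypothetical layer widths
weights = [0.1 * rng.standard_normal((m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
probs = fnn_forward(features, weights, biases)  # class probabilities
```

With random weights the output probabilities are meaningless; in the paper the network is trained on the GaitRec dataset, and the reported accuracy (96.22%) comes from the learned multi-modal features, not from this toy setup.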