Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/5457
Full metadata record
DC Field: Value [Language]

dc.contributor.author: Sahoo, Arupananda
dc.contributor.author: Ghosh, Mainak
dc.contributor.author: Nandy, Anup
dc.date.accessioned: 2025-12-26T11:51:47Z
dc.date.available: 2025-12-26T11:51:47Z
dc.date.issued: 2025-12
dc.identifier.citation: 11th International Conference on Pattern Recognition and Machine Intelligence (PReMI), IIT Delhi, 11-14 December 2025 [en_US]
dc.identifier.uri: http://hdl.handle.net/2080/5457
dc.description: Copyright belongs to the proceedings publisher. [en_US]
dc.description.abstract: Gait abnormality classification of neurodegenerative disorders plays a vital role in early diagnosis and clinical decision-making. Traditional machine learning approaches often rely on handcrafted features and fail to capture the complex temporal and spatial dynamics present in gait signals, limiting their classification performance. To overcome these limitations, we present a hybrid deep learning model combining Convolutional Neural Networks (CNN), Bidirectional Long Short-Term Memory (BiLSTM), and Gated Recurrent Units (GRU) for classifying gait patterns associated with neurodegenerative diseases using the PhysioNet GaitNDD dataset. The model achieves high classification accuracies of 97.14% for neurodegenerative disease (NDD) vs healthy controls (HC), 98.06% for amyotrophic lateral sclerosis (ALS) vs HC, 98.14% for Huntington's disease (HD) vs HC, and 96.22% for Parkinson's disease (PD) vs HC, outperforming existing methods. A detailed explainability analysis using SHAP highlights the importance of features such as sample entropy and stride interval, confirming the model's focus on clinically relevant patterns. [en_US]
dc.subject: CNN-BiLSTM-GRU [en_US]
dc.subject: GaitNDD [en_US]
dc.subject: Explainable AI [en_US]
dc.subject: Gait Classification [en_US]
dc.title: Hybrid Neural Model for Classification of Neurodegenerative Gait Disorders [en_US]
dc.type: Article [en_US]
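
The abstract above describes the hybrid architecture only at a high level. As a rough illustration, the following is a minimal Keras sketch of a CNN-BiLSTM-GRU stack for one of the binary gait-classification tasks (e.g. PD vs HC); the window length, channel count, filter sizes, and unit counts are placeholder assumptions, not the configuration reported in the paper.

import numpy as np
from tensorflow.keras import layers, models

N_TIMESTEPS = 300  # assumed length of each gait-signal window
N_CHANNELS = 2     # assumed channels, e.g. left/right stride-interval series

model = models.Sequential([
    layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
    # CNN stage: extracts local temporal/spatial patterns from the raw signal
    layers.Conv1D(64, kernel_size=5, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # BiLSTM stage: models long-range context in both time directions
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    # GRU stage: condenses the sequence into a fixed-size representation
    layers.GRU(32),
    layers.Dense(1, activation="sigmoid"),  # binary output: disease vs HC
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Smoke test on random data with the assumed shapes (not real GaitNDD signals)
x = np.random.rand(8, N_TIMESTEPS, N_CHANNELS).astype("float32")
y = np.random.randint(0, 2, size=(8, 1))
model.fit(x, y, epochs=1, verbose=0)

The SHAP analysis mentioned in the abstract would sit on top of a trained model of this kind, attributing predictions to input features such as sample entropy and stride interval; the specific explainer and feature set used by the authors are not given in this record.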
Appears in Collections: Conference Papers

Files in This Item:
File: 2025_PReMI_ASahoo_Hybrid.pdf (644.56 kB, Adobe PDF)

