Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/3768
Full metadata record
DC Field | Value | Language
dc.contributor.author | Panda, Bibekananda | -
dc.contributor.author | Singh, Poonam | -
dc.date.accessioned | 2022-11-21T10:48:25Z | -
dc.date.available | 2022-11-21T10:48:25Z | -
dc.date.issued | 2022-11 | -
dc.identifier.citation | IEEE conference ODICON-2022, to be held during 11th-12th November 2022, Siksha 'O' Anusandhan (Deemed to be University), Bhubaneswar, Odisha, India | en_US
dc.identifier.uri | http://hdl.handle.net/2080/3768 | -
dc.description | Copyright belongs to proceeding publisher | en_US
dc.description.abstract | Deep learning and machine learning-based algorithms are two new methodologies for solving time-series prediction challenges, and traditional regression-based modeling has been found to provide less accurate results than these techniques. A deep learning-assisted method for detecting signals in orthogonal frequency division multiplexing-based non-orthogonal multiple access (OFDM-NOMA) systems is described. A deep neural network (DNN) with a bidirectional long short-term memory (Bi-LSTM) layer is used to detect signals, trained with different deep learning optimizers such as SGDM, RMSprop, and Adam, and the most accurate combination of network and optimizer is identified by comparison. Simulations show that the deep learning technique can outperform the conventional successive interference cancellation (SIC) method and that the Bi-LSTM-based algorithm detects signals in NOMA scenarios more effectively than the long short-term memory (LSTM) model. As a result, deep learning is a reliable and essential method for detecting NOMA signals. | en_US
dc.subject | NOMA | en_US
dc.subject | LSTM | en_US
dc.subject | Bi-LSTM | en_US
dc.subject | Optimizers | en_US
dc.title | Signal Detection in NOMA Systems using DNN with Bidirectional LSTM | en_US
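
For readers of the abstract above, the following is a minimal, hypothetical sketch of the kind of Bi-LSTM detector and optimizer comparison it describes. It is not the authors' code; the input shape, layer sizes, learning rates, and names are illustrative assumptions, written here in Python/Keras.

```python
# Minimal, hypothetical sketch of a Bi-LSTM NOMA detector (not the paper's code).
# Assumes received OFDM-NOMA samples arrive as (subcarrier, real/imag) sequences
# and the network outputs bit estimates for one user; all sizes are illustrative.
from tensorflow.keras import layers, models, optimizers

SUBCARRIERS = 64        # assumed number of OFDM subcarriers per input sequence
FEATURES = 2            # real and imaginary parts of each received sample
BITS_PER_SYMBOL = 2     # assumed label size (e.g. QPSK bits of the detected user)

def build_bilstm_detector(optimizer):
    """DNN with one bidirectional LSTM layer followed by dense layers."""
    model = models.Sequential([
        layers.Input(shape=(SUBCARRIERS, FEATURES)),
        layers.Bidirectional(layers.LSTM(64)),                 # Bi-LSTM feature extractor
        layers.Dense(32, activation="relu"),
        layers.Dense(BITS_PER_SYMBOL, activation="sigmoid"),   # per-bit estimates
    ])
    model.compile(optimizer=optimizer,
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Optimizers named in the abstract: SGD with momentum (sgdm), RMSprop, Adam.
candidates = {
    "sgdm": optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "rmsprop": optimizers.RMSprop(learning_rate=0.001),
    "adam": optimizers.Adam(learning_rate=0.001),
}
detectors = {name: build_bilstm_detector(opt) for name, opt in candidates.items()}
```

Under these assumptions, each detector would be trained on simulated OFDM-NOMA received symbols and its bit-error performance compared against SIC and an LSTM-only baseline, which is the comparison the abstract reports.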
Appears in Collections: Conference Papers

Files in This Item:
File | Description | Size | Format
PandaB_ODICON 2022.pdf | | 502.47 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.