Please use this identifier to cite or link to this item:
http://hdl.handle.net/2080/3871
Title: | A DEEP LEARNING FRAMEWORK FOR PREDICTING SIGNALS IN OFDM-NOMA WITH VARIOUS ALGORITHMS
Authors: | Panda, Bibekananda; Singh, Poonam
Keywords: | NOMA; DNN; GRU; LSTM; Bi-LSTM
Issue Date: | Dec-2022 |
Citation: | International Conference on Machine Learning, AI and Education (MLAEDU2022), Dubai, 17-18 Dec 2022
Abstract: | Non-orthogonal multiple access (NOMA) has attracted increasing interest and is a promising technique for wireless communication systems beyond the fifth generation (5G). In NOMA systems, successive interference cancellation (SIC) is typically performed at the receiver, where the signals of several users are decoded sequentially; because of residual interference, the detection accuracy of later users depends strongly on the successful detection of earlier users. A deep learning-based NOMA receiver is analyzed that detects the signals of multiple users in a single stage without explicit channel estimation. This paper studies the deep learning (DL)-based receiver for NOMA signal detection with several DL sequence-layer architectures and optimizers, trained on orthogonal frequency division multiplexing (OFDM) symbols. The simulation results compare the characteristics of the various DL-based receivers with the conventional SIC approach and show that the different DL-based models detect signals more predictably than the SIC approach. |
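As a rough illustration of the receiver the abstract describes, the sketch below builds a sequence-layer (Bi-LSTM) detector that maps received OFDM symbols directly to one user's bits, bypassing explicit channel estimation and SIC. All concrete choices here (Keras API, 64 subcarriers, QPSK-style two bits per subcarrier, layer widths, the Adam optimizer, and the random placeholder data) are assumptions for illustration only and are not taken from the paper.

```python
# Hypothetical sketch of a DL-based NOMA detector using a Bi-LSTM sequence layer.
# Frame length, modulation order, layer sizes, and optimizer are assumed values.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_SUBCARRIERS = 64   # assumed OFDM frame length (subcarriers per symbol)
BITS_PER_SYMBOL = 2    # assumed QPSK mapping per subcarrier

def build_bilstm_detector():
    # Input: real and imaginary parts of the received superimposed (NOMA) signal
    # on each subcarrier of one OFDM symbol.
    inputs = layers.Input(shape=(NUM_SUBCARRIERS, 2))
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(inputs)
    x = layers.Bidirectional(layers.LSTM(32, return_sequences=True))(x)
    # Sigmoid outputs: probability of each transmitted bit for the target user.
    outputs = layers.TimeDistributed(
        layers.Dense(BITS_PER_SYMBOL, activation="sigmoid"))(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_bilstm_detector()
    # Random placeholder data standing in for simulated NOMA/OFDM training pairs.
    rx = np.random.randn(256, NUM_SUBCARRIERS, 2).astype("float32")
    bits = np.random.randint(
        0, 2, (256, NUM_SUBCARRIERS, BITS_PER_SYMBOL)).astype("float32")
    model.fit(rx, bits, epochs=1, batch_size=32, verbose=0)
```

Swapping the `Bidirectional(LSTM(...))` layers for plain `LSTM` or `GRU` layers gives the other sequence-layer variants named in the keywords under the same training setup.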
Description: | Copyright belongs to proceeding publisher |
URI: | http://hdl.handle.net/2080/3871 |
Appears in Collections: | Conference Papers |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Singhp_-MLAEDU2022.pdf | | 843.83 kB | Adobe PDF