Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/5387
Full metadata record
DC Field | Value | Language
dc.contributor.author | Priyadarshini, Prangya | -
dc.contributor.author | Chinara, Suchismita | -
dc.contributor.author | Kumar, Arun | -
dc.date.accessioned | 2025-12-11T10:32:24Z | -
dc.date.available | 2025-12-11T10:32:24Z | -
dc.date.issued | 2025-11 | -
dc.identifier.citation | IEEE Future Networks World Forum (IEEE FNWF), Bangalore, 10-12 November 2025 | en_US
dc.identifier.uri | http://hdl.handle.net/2080/5387 | -
dc.description | Copyright belongs to the proceeding publisher. | en_US
dc.description.abstract | The massive Internet of Things (IoT) envisioned for 6G requires transmission systems that are intelligent, autonomous, and resilient. Meeting the strict Quality of Service (QoS) requirements of future applications is difficult under high mobility and unpredictable network conditions, as in Vehicular IoT (V-IoT). This study presents an AI-driven, cross-layer control architecture that uses dynamic aerial edge nodes (UAVs) to provide reliable data distribution. A Recurrent Neural Network-based Generative Adversarial Network (RNN-GAN) serves as a prediction engine that forecasts probability distributions of future link quality and network congestion. These cross-layer observations are fed to a Model Predictive Control (MPC) agent at the network layer, which selects routes proactively while accounting for risk. Preliminary simulation results show that the proposed strategy improves the Packet Delivery Ratio (PDR) by up to 16% and reduces end-to-end latency by over 30% under heavy traffic, and by up to 65% when aerial relays are employed. | en_US
dc.subject | VANETs | en_US
dc.subject | 6G | en_US
dc.subject | Massive IoT | en_US
dc.subject | RNN | en_US
dc.title | Uncertainty-Aware MPC Routing for UAV-Assisted VANETs using RNN-GANs | en_US
dc.type | Article | en_US
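
The abstract above describes an RNN-GAN forecaster that outputs probability distributions of future link quality, feeding a risk-aware MPC routing agent. The minimal Python sketch below illustrates only that control idea under stated assumptions: the RNN-GAN is replaced by a synthetic delay sampler, and the route structures, delay values, and mean-plus-deviation risk measure are hypothetical, not details taken from the paper.

# Illustrative only: a toy, risk-aware route selector driven by sampled link-delay
# forecasts. The sampler below is a stand-in for the paper's RNN-GAN predictor;
# all names, numbers, and the risk measure are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(0)

def sample_link_delays(route, horizon=5, n_samples=200):
    """Stand-in for the RNN-GAN: draw per-hop delay samples (ms) over a short
    planning horizon for every hop on the candidate route."""
    means = np.array([hop["mean_ms"] for hop in route])
    stds = np.array([hop["std_ms"] for hop in route])
    # Result shape: (n_samples, horizon, n_hops); clipped to stay positive.
    return rng.normal(means, stds, size=(n_samples, horizon, len(route))).clip(min=0.1)

def risk_aware_cost(delay_samples, risk_weight=1.0):
    """MPC-style cost: expected end-to-end delay plus a penalty on its spread,
    so routes with uncertain links are penalised."""
    end_to_end = delay_samples.sum(axis=2).mean(axis=1)  # per-sample horizon average
    return end_to_end.mean() + risk_weight * end_to_end.std()

def choose_route(candidate_routes, risk_weight=1.0):
    """Pick the candidate route with the lowest risk-adjusted predicted cost."""
    costs = [risk_aware_cost(sample_link_delays(r), risk_weight) for r in candidate_routes]
    return int(np.argmin(costs)), costs

if __name__ == "__main__":
    # Two hypothetical candidate paths: a ground-only route and a UAV-relayed route.
    ground_route = [{"mean_ms": 12, "std_ms": 6}, {"mean_ms": 15, "std_ms": 8}]
    uav_route = [{"mean_ms": 10, "std_ms": 2}, {"mean_ms": 11, "std_ms": 2}, {"mean_ms": 9, "std_ms": 2}]
    best, costs = choose_route([ground_route, uav_route], risk_weight=1.5)
    print(f"risk-adjusted costs: {[round(c, 1) for c in costs]}, chosen route index: {best}")

In a full system the sampler would be the trained RNN-GAN conditioned on recent cross-layer observations, and the cost could be replaced by any risk measure (for example a tail quantile of predicted delay) without changing the selection loop.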
Appears in Collections: Conference Papers

Files in This Item:
File | Description | Size | Format
2025_FNWF_AKumar_Uncertainty.pdf | | 11.37 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.