Abstract:
Vehicle Edge Computing (VEC) leverages compact cloud computing at the mobile network edge to meet the processing and latency needs of vehicles. By bringing computation closer to the vehicles, VEC reduces data transmission, minimizes latency, and boosts performance for compute-intensive applications. However, during peak hours of urban road traffic, the scarce computational resources available at edge servers could pose challenges in fulfilling the processing needs of vehicles. Introducing Unmanned Aerial Vehicles (UAVs) as supplementary edge computing nodes could significantly mitigate this issue. In this paper, we propose a flexible edge computing framework in which a fleet of UAVs functions as mobile computational service providers, offering computation offloading services to multiple vehicles. We design and optimize a computation offloading model for the UAV-enabled vehicle edge computing environment. The proposed model tackles the task offloading challenge, aiming to optimize UAV revenue and task processing efficiency while accounting for the UAVs' limited computational power and energy resources. To this end, our model jointly considers two key factors: task partitioning and computational resource allocation. To address the resulting non-convex optimization problem, we construct a Markov Decision Process (MDP) model for the multi-UAV-enabled mobile edge computing system and introduce a Multi-Agent Deep Reinforcement Learning (MADRL) framework that solves the decision-making problem represented by the MDP model. Comprehensive simulation results show that the proposed task offloading technique outperforms baseline optimization methods.
Published in: IEEE Internet of Things Journal (Early Access)
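The abstract formulates offloading as an MDP with per-UAV agents choosing task partition ratios and CPU allocations under energy constraints. The sketch below illustrates what one decision step of such an MDP might look like; all names, constants, state/action shapes, and the reward form (revenue minus latency and energy penalties) are illustrative assumptions standing in for the paper's actual formulation and MADRL actor networks.

```python
# Hypothetical sketch of one decision step in a UAV-assisted offloading MDP.
# Constants, state design, and reward shape are assumptions, not the paper's model.
import numpy as np

N_UAVS, N_VEHICLES = 3, 6
CYCLES_PER_BIT = 500          # assumed task compute density (cycles/bit)
UAV_CPU_HZ = 5e9              # assumed UAV CPU capacity (cycles/s)
PRICE_PER_CYCLE = 1e-9        # assumed payment rate (revenue term)
ENERGY_PER_CYCLE = 1e-10      # assumed energy cost per CPU cycle (J)

rng = np.random.default_rng(0)

def observe():
    """Local observation per UAV agent: pending task bits per vehicle
    plus the UAV's remaining energy budget (assumed state design)."""
    task_bits = rng.uniform(1e6, 5e6, size=(N_UAVS, N_VEHICLES))
    energy = rng.uniform(5e3, 1e4, size=(N_UAVS, 1))
    return np.concatenate([task_bits, energy], axis=1), task_bits, energy

def act(obs):
    """Placeholder random policy standing in for the MADRL actors.
    Action per UAV: an offloading (partition) ratio per vehicle task
    and a CPU-share allocation across vehicles."""
    ratio = rng.uniform(0.0, 1.0, size=(N_UAVS, N_VEHICLES))      # fraction offloaded
    cpu_share = rng.dirichlet(np.ones(N_VEHICLES), size=N_UAVS)   # resource allocation
    return ratio, cpu_share

def step(task_bits, energy, ratio, cpu_share):
    """Assumed reward: payment for processed cycles minus latency and
    energy penalties, mirroring the revenue/efficiency trade-off."""
    offloaded_cycles = ratio * task_bits * CYCLES_PER_BIT
    latency = offloaded_cycles / np.maximum(cpu_share * UAV_CPU_HZ, 1.0)
    energy_used = offloaded_cycles * ENERGY_PER_CYCLE
    feasible = energy_used.sum(axis=1, keepdims=True) <= energy
    revenue = PRICE_PER_CYCLE * offloaded_cycles.sum(axis=1)
    reward = np.where(
        feasible.ravel(),
        revenue - 0.1 * latency.max(axis=1) - 1e-4 * energy_used.sum(axis=1),
        -1.0,  # penalize energy-infeasible allocations
    )
    return reward

obs, task_bits, energy = observe()
ratio, cpu_share = act(obs)
print("per-UAV reward:", step(task_bits, energy, ratio, cpu_share))
```

In a full MADRL setup, the random `act` function would be replaced by per-agent actor networks trained on these rewards; the sketch only fixes the interface between observation, joint action, and reward.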