Abstract:
Content caching at the edge of vehicular networks is considered a promising technology for satisfying the increasing demands of computation-intensive and latency-sensitive vehicular applications for intelligent transportation. Existing content caching schemes, when applied to vehicular networks, face two distinct challenges: 1) vehicles connected to an edge server keep moving, making content popularity highly dynamic and hard to predict; 2) cached content easily becomes out-of-date, since each connected vehicle stays in the coverage area of an edge server for only a short time. To address these challenges, we propose a Mobility-aware Proactive edge Caching scheme based on Federated learning (MPCF). This scheme enables multiple vehicles to collaboratively learn a global model for predicting content popularity, using the private training data distributed across local vehicles. MPCF also employs a Context-aware Adversarial AutoEncoder to predict the highly dynamic content popularity. In addition, MPCF integrates a mobility-aware cache replacement policy, which allows the network edges to add or evict contents in response to the mobility patterns and preferences of vehicles. MPCF can greatly improve cache performance, effectively protect users' privacy, and significantly reduce communication costs. Experimental results demonstrate that MPCF outperforms other baseline caching schemes in terms of cache hit ratio in vehicular edge networks.
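The abstract describes two mechanisms: federated aggregation of vehicle-local popularity-prediction models into a global model, and a mobility-aware cache replacement policy at the edge server. The Python sketch below is a minimal illustration of both under stated assumptions; it is not the paper's implementation. The FedAvg-style sample-count weighting, the dwell-time-weighted eviction score, and all function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of federated aggregation and
# mobility-aware cache replacement, as summarized in the abstract.
from typing import Dict, List

import numpy as np


def federated_average(local_weights: List[Dict[str, np.ndarray]],
                      sample_counts: List[int]) -> Dict[str, np.ndarray]:
    """FedAvg-style aggregation (assumed): weight each vehicle's locally
    trained model parameters by its number of local training samples."""
    total = float(sum(sample_counts))
    global_weights: Dict[str, np.ndarray] = {}
    for name in local_weights[0]:
        global_weights[name] = sum(
            w[name] * (n / total) for w, n in zip(local_weights, sample_counts)
        )
    return global_weights


def mobility_aware_replacement(cached: Dict[int, float],
                               predicted: Dict[int, float],
                               dwell_time: Dict[int, float],
                               capacity: int) -> List[int]:
    """Select which contents to keep in the edge cache: merge currently
    cached and newly predicted popularities, weight each by the expected
    dwell time of the requesting vehicles (assumed scoring rule), and keep
    the top `capacity` contents; the rest are evicted."""
    candidates = {**cached, **predicted}
    score = {cid: pop * dwell_time.get(cid, 1.0)
             for cid, pop in candidates.items()}
    return sorted(score, key=score.get, reverse=True)[:capacity]


if __name__ == "__main__":
    # Two vehicles upload locally trained weights; the edge server aggregates
    # them into a global popularity-prediction model.
    w1 = {"layer": np.array([1.0, 2.0])}
    w2 = {"layer": np.array([3.0, 4.0])}
    print(federated_average([w1, w2], sample_counts=[100, 300]))

    # Decide which contents the edge server should cache next, favoring
    # contents requested by vehicles that stay longer in the coverage area.
    print(mobility_aware_replacement(
        cached={1: 0.2, 2: 0.5},
        predicted={3: 0.9, 4: 0.4},
        dwell_time={1: 0.5, 2: 1.0, 3: 0.8, 4: 2.0},
        capacity=2,
    ))
```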
Published in: IEEE Transactions on Intelligent Transportation Systems (Volume: 22, Issue: 8, August 2021)
Index Terms:
- Federated Learning
- Edge Caching
- Proactive Edge Caching
- Privacy
- Global Model
- Communication Cost
- Edge Server
- Multiple Vehicles
- Vehicular Networks
- Content Popularity
- Content Caching
- Caching Scheme
- Cache Hit
- Deep Learning
- Local Data
- Generative Adversarial Networks
- Base Station
- Updated Model
- Variational Autoencoder
- Local Training
- Roadside Units
- Content Request
- Communication Rounds
- Mobile Edge Computing
- Edge Nodes
- Latent Code
- Longer Training Time
- Cache Misses
- High-quality Models
- Mobile Vehicles