Abstract:
Fog radio access networks (F-RANs) are seen as potential architectures to support Internet of Things services by leveraging edge caching and edge computing. However, current works studying resource management in F-RANs mainly consider a static system with only one communication mode. Given network dynamics, resource diversity, and the coupling of resource management with mode selection, resource management in F-RANs becomes very challenging. Motivated by recent developments in artificial intelligence, a deep reinforcement learning (DRL)-based joint mode selection and resource management approach is proposed. Each user equipment (UE) can operate either in cloud RAN (C-RAN) mode or in device-to-device (D2D) mode, and the resources managed include both radio and computing resources. The core idea is that the network controller makes intelligent decisions on UE communication modes and processors' on-off states, with precoding for UEs in C-RAN mode optimized subsequently, aiming to minimize long-term system power consumption under the dynamics of edge cache states. Simulations demonstrate the impact of several parameters, such as the learning rate and edge caching service capability, on system performance, and comparisons with other schemes confirm the proposal's effectiveness. Moreover, transfer learning is integrated with DRL to accelerate the learning process.
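The abstract's core loop (observe cache states, choose UE modes and processor on-off states, receive negative power consumption as reward, store the transition in replay memory, update the value function) can be sketched with a tabular Q-learning stand-in for the paper's deep Q-network. Everything below is illustrative, not from the paper: the power model, the two-UE state of per-UE cache hits/misses, and the random cache dynamics are all assumptions.

```python
import random
from collections import defaultdict, deque

# Toy stand-in for the paper's DRL controller: tabular Q-learning with a
# replay memory. State = per-UE edge cache hit/miss (assumption); action =
# per-UE communication mode (C-RAN or D2D) plus one processor on/off flag;
# reward = negative system power. All numbers are illustrative.

random.seed(0)

N_UE = 2
MODES = ["C-RAN", "D2D"]
ACTIONS = [(m0, m1, proc) for m0 in MODES for m1 in MODES for proc in (0, 1)]

def power(state, action):
    """Illustrative power model: C-RAN mode is cheap only if a cloud
    processor is on; D2D mode is cheap only on an edge cache hit; an
    active processor adds a fixed power cost."""
    cost = 5.0 * action[2]                        # processor on/off cost
    for ue in range(N_UE):
        if action[ue] == "C-RAN":
            cost += 2.0 if action[2] else 10.0    # needs cloud processing
        else:
            cost += 1.0 if state[ue] else 8.0     # D2D cheap on cache hit
    return cost

Q = defaultdict(float)          # Q-table replacing the deep Q-network
replay = deque(maxlen=100)      # replay memory
alpha, gamma, eps = 0.1, 0.9, 0.2

state = (1, 0)  # cache hit for UE 0, miss for UE 1
for step in range(2000):
    # epsilon-greedy action selection over joint mode + processor actions
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    reward = -power(state, action)
    # edge cache state evolves stochastically (stand-in for cache dynamics)
    next_state = tuple(random.randint(0, 1) for _ in range(N_UE))
    replay.append((state, action, reward, next_state))
    # sample a minibatch from replay memory and apply the Q-learning update
    for s, a, r, s2 in random.sample(replay, min(8, len(replay))):
        best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    state = next_state

# Greedy action once both UEs have cache hits: D2D with the processor off
# should be the cheapest choice under the toy power model above.
best = max(ACTIONS, key=lambda a: Q[((1, 1), a)])
print(best)
```

The paper replaces the Q-table with a deep network (hence replay memory for decorrelated minibatches) and optimizes C-RAN precoding separately after the discrete decisions are made; this sketch only mirrors the discrete decision layer.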
Published in: IEEE Internet of Things Journal ( Volume: 6, Issue: 2, April 2019)
IEEE Keywords (Index Terms):
- Resource Management
- Selective Modulators
- Fog Radio Access Network
- Deep Learning
- Learning Rate
- Power Consumption
- Computational Resources
- Dynamic Conditions
- Internet Of Things
- Transfer Learning
- Control Network
- Mode Of Communication
- Deep Reinforcement Learning
- Edge Computing
- Long-term Consumption
- User Equipment
- Radio Resource
- Intelligent Decision
- System Power Consumption
- Edge Caching
- Quality Of Service Constraints
- Replay Memory
- Markov Decision Process
- Wireless Networks
- Service Quality
- Deep Reinforcement Learning Model
- Content Request
- State Space
- Update Frequency
- Energy Minimization Problem