Abstract:
Smart grids have attracted increasing attention because they offer far more interactive solutions than the conventional grid, such as enabling demand-side management (DSM) and demand response. In a smart grid, the Wide Area Network (WAN) is the largest part of the communication network and provides connectivity to the energy generation domains, such as solar panels. The Internet of Things (IoT) can be adopted to provide communication in the WAN. However, because the WAN covers the largest area, IoT may suffer in terms of quality of service and energy efficiency owing to the long distances over which data must be transmitted. IoT-edge computing can therefore be adopted to process data closer to its source before it is transmitted to cloud services. Energy efficiency in IoT is critical, especially in the smart grid context: when a fault occurs in the grid, the sensors and devices in the affected areas are isolated and face a shortage of power supply. In this paper, we propose a power control method to address the problem stated above. To find the best action, deep reinforcement learning (DRL) is adopted; DRL can learn to make decisions in a dynamic environment by combining deep learning and reinforcement learning techniques. Simulation results show that the proposed method improves energy efficiency.
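To make the general idea concrete, the sketch below shows what a DRL-based power control loop of this kind might look like. It is a minimal illustration only, not the paper's method: the state features, the discrete transmit-power levels, the network sizes, and the energy-efficiency reward are all assumptions introduced here for demonstration, and the environment is a random surrogate rather than a smart grid WAN model.

import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

# Illustrative sketch: a toy DQN agent that picks a discrete transmit-power
# level for an IoT-edge node. Power levels, state features, and the reward
# (a simple energy-efficiency proxy) are assumptions, not the paper's model.
POWER_LEVELS = [0.1, 0.2, 0.5, 1.0]   # hypothetical transmit powers (W)
STATE_DIM = 3                          # e.g. channel gain, queue length, battery level

class QNet(nn.Module):
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, x):
        return self.net(x)

def energy_efficiency(state, power):
    # Toy reward: Shannon-like throughput divided by consumed power.
    channel_gain = state[0]
    rate = np.log2(1.0 + channel_gain * power / 0.01)  # noise power assumed 0.01 W
    return rate / power

q_net = QNet(STATE_DIM, len(POWER_LEVELS))
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma, epsilon = 0.95, 0.1

def select_action(state):
    # Epsilon-greedy selection over the Q-values.
    if random.random() < epsilon:
        return random.randrange(len(POWER_LEVELS))
    with torch.no_grad():
        q = q_net(torch.tensor(state, dtype=torch.float32))
    return int(q.argmax().item())

def train_step(batch_size=32):
    # One temporal-difference update from a random minibatch of transitions.
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2 = map(np.array, zip(*batch))
    s = torch.tensor(s, dtype=torch.float32)
    a = torch.tensor(a, dtype=torch.int64).unsqueeze(1)
    r = torch.tensor(r, dtype=torch.float32)
    s2 = torch.tensor(s2, dtype=torch.float32)
    q = q_net(s).gather(1, a).squeeze(1)
    with torch.no_grad():
        target = r + gamma * q_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Minimal interaction loop against a random surrogate environment.
state = np.random.rand(STATE_DIM)
for step in range(1000):
    action = select_action(state)
    reward = energy_efficiency(state, POWER_LEVELS[action])
    next_state = np.random.rand(STATE_DIM)
    replay.append((state, action, reward, next_state))
    train_step()
    state = next_state

In this sketch the agent trades transmission rate against consumed power; any realistic study would replace the surrogate environment and reward with the grid and channel models actually used in the paper.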
Published in: 2023 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET)
Date of Conference: 12-14 September 2023
Date Added to IEEE Xplore: 27 October 2023