Optimizing a Dual-band Antenna for IoT Systems by Machine Learning Algorithms


Abstract:

The IoT ecosystem has grown rapidly with the appearance of many compact devices connected through diverse transmission technologies and operating in several popular frequency bands. However, designing and optimizing compact, multi-band antennas for IoT systems with traditional methods requires significant time and effort. Machine Learning (ML) is emerging as a new trend in antenna research, supporting the design and optimization process. Current research often focuses only on optimizing the performance of single-band antennas with traditional structures such as circular and rectangular patch antennas, and it still requires manual calculation of the antenna's size parameters. This paper proposes a new method, called Machine Learning Antenna (MLA), for optimizing an antenna's size parameters, resonant frequencies, and bandwidths. MLA uses machine learning algorithms to compute the dimensions of a dual-band antenna from the desired resonant frequencies and bandwidths, with a mean squared error (MSE) above 0.94. The dual-band antenna for IoT systems designed with the MLA method has a compact size of 30 mm x 40 mm on an FR-4 substrate with a thickness of 1.6 mm. It uses a Defected Ground Structure (DGS) and shorting pins to reduce its size, and operates flexibly in the 2.4 GHz, 5 GHz, and 6 GHz bands, making it suitable for IoT, IoV, WiFi 6, WiFi 7, 4G, and 5G systems.
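The core MLA idea described above — predicting antenna dimensions directly from target resonant frequencies — can be sketched as a small regression problem. Everything below is an illustrative assumption, not the paper's actual pipeline: the training data is synthetic, generated from the textbook half-wavelength patch approximation L ≈ c / (2 f √ε_eff), with ε_eff ≈ 4.2 assumed for FR-4; the paper's real dataset and model are not given here.

```python
# Toy inverse-design sketch (hypothetical): fit a linear model that maps a
# desired resonant frequency to a patch length, trained on synthetic data
# from the textbook half-wavelength approximation. NOT the paper's MLA model.

C_MM_GHZ = 300.0   # speed of light, in mm*GHz units
EPS_EFF = 4.2      # assumed effective permittivity for FR-4

def patch_length_mm(f_ghz):
    """Textbook half-wavelength patch length resonating at f_ghz."""
    return C_MM_GHZ / (2.0 * f_ghz * EPS_EFF ** 0.5)

def fit_simple(xs, ys):
    """Closed-form simple linear regression: y ~ w*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    w = sxy / sxx
    return w, my - w * mx

# Training set: target frequencies 2.2-6.0 GHz.
# Using the feature x = 1/f makes the physical relation exactly linear.
freqs = [2.2 + 0.2 * i for i in range(20)]
xs = [1.0 / f for f in freqs]
ys = [patch_length_mm(f) for f in freqs]

w, b = fit_simple(xs, ys)
print(f"predicted patch length at 2.4 GHz: {w / 2.4 + b:.1f} mm")  # -> 30.5 mm
```

A practical version of this idea would replace the single closed-form feature with a multi-output model (e.g. a neural network) trained on full-wave simulation data, taking both frequencies and bandwidths of the two bands as inputs and returning all patch, DGS, and shorting-pin dimensions at once.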
Date of Conference: 17-19 October 2024
Date Added to IEEE Xplore: 07 March 2025
Conference Location: Ho Chi Minh City, Vietnam

