Abstract:
Nowadays, Automated Machine Learning (AutoML) has gradually become an inevitable trend, providing automatic and suitable solutions for AI tasks without requiring additional effort from experts. Neural Architecture Search (NAS), a subfield of AutoML, has produced automated models that solve fundamental problems in computer vision such as image recognition and object detection. NAS with differentiable search strategies has significantly reduced the GPU time spent on computation. In this paper, we present an effective algorithm that expands the search space by selecting operation candidates from the initial set in different ways and executing them concurrently. The extended search space gives NAS more opportunities to find good architectures by running the group of search spaces in overlapping time periods instead of sequentially. Our approach, called Accelerated NAS, shortens search time by 1.8x compared to previous works. In addition, Accelerated NAS generates promising neural architectures with comparable performance and low inference time.
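The idea of deriving several search spaces from one initial operation set and searching them concurrently can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the operation names, the way subsets are derived, and the `search_best` scoring function are all placeholder assumptions.

```python
# Hypothetical sketch of Accelerated NAS's concurrent search-space idea:
# derive several candidate-operation subsets from an initial set, then run
# a search over each subset in overlapping time periods instead of one
# after another. All names and the scorer below are illustrative only.
import itertools
from concurrent.futures import ThreadPoolExecutor

# Assumed initial operation set (typical DARTS-style candidates).
INITIAL_OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]

def derive_search_spaces(ops, size):
    """Select operation candidates from the initial set in different ways,
    here simply as all subsets of a fixed size."""
    return [list(c) for c in itertools.combinations(ops, size)]

def search_best(space):
    """Toy stand-in for one differentiable NAS run over a search space;
    a real run would train a supernet and pick ops by architecture weights."""
    return max(space, key=len)

spaces = derive_search_spaces(INITIAL_OPS, size=3)

# Run the group of search spaces concurrently rather than sequentially.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(search_best, spaces))
```

Each concurrent run returns a candidate architecture choice from its own subset, so good architectures can be found in any of the derived spaces without waiting for the others to finish.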
Published in: 2021 International Conference on Information and Communication Technology Convergence (ICTC)
Date of Conference: 20-22 October 2021
Date Added to IEEE Xplore: 07 December 2021
Print on Demand(PoD) ISSN: 2162-1233