Abstract:
Neural Architecture Search (NAS) is a fast-growing technology for the automatic design of deep-learning architectures. NAS comprises three stages: search space design, search strategy, and evaluation criterion. Among these, the evaluation of candidate architectures is a very cost-intensive task. In this work, we propose a set of receptive-field-reliant zero-cost proxies that need only one iteration of training, thereby reducing the computational time associated with the evaluation criterion during NAS. The proposed zero-cost proxies are based on a layer-wise binding of the prune-at-initialization score with its receptive field, yielding a more effective and generalizable measure than the vanilla counterparts. The proposed zero-cost proxies are validated on a set of PyTorchCV models and on the NAS-Bench-201 benchmarking dataset. They perform better than the vanilla counterparts on the set of PyTorchCV models and competitively with them on NAS-Bench-201. The efficiency of the proposed method is also demonstrated in NAS on NAS-Bench-201 using Aging Evolution as the controller.
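The abstract does not give the exact formulation, so the following is only a minimal sketch of what a receptive-field-weighted, prune-at-initialization proxy of this kind could look like in PyTorch. It assumes a SNIP-style saliency as the vanilla layer-wise score and a simple analytic receptive-field estimate; the helper names (snip_layer_scores, receptive_field_sizes, rf_weighted_proxy) and the product-then-sum weighting scheme are illustrative assumptions, not the authors' formula.

# Sketch (assumptions): SNIP-style prune-at-initialization scores from ONE
# training batch, each layer's score bound to an (assumed) analytic
# receptive-field size. Not the authors' exact method.
import torch
import torch.nn as nn

def snip_layer_scores(model, x, y, loss_fn=nn.CrossEntropyLoss()):
    """Per-layer SNIP saliency sum(|grad * weight|) from a single
    forward/backward pass on one labeled mini-batch."""
    model.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    scores = {}
    for name, m in model.named_modules():
        if isinstance(m, (nn.Conv2d, nn.Linear)) and m.weight.grad is not None:
            scores[name] = (m.weight.grad * m.weight).abs().sum().item()
    return scores

def receptive_field_sizes(model):
    """Hypothetical helper: cumulative receptive field per conv layer,
    composed analytically from kernel sizes and strides. Assumes the
    modules form a sequential conv stack in registration order."""
    rf, jump, sizes = 1, 1, {}
    for name, m in model.named_modules():
        if isinstance(m, nn.Conv2d):
            k, s = m.kernel_size[0], m.stride[0]
            rf = rf + (k - 1) * jump   # rf_out = rf_in + (k - 1) * jump_in
            jump = jump * s            # jump_out = jump_in * stride
            sizes[name] = rf
    return sizes

def rf_weighted_proxy(model, x, y):
    """Bind each layer's prune-at-init score to its receptive field
    (here a simple per-layer product summed over layers -- an assumed
    aggregation, chosen only for illustration)."""
    scores = snip_layer_scores(model, x, y)
    rfs = receptive_field_sizes(model)
    return sum(scores[n] * rfs.get(n, 1) for n in scores)

Under these assumptions, a single call such as rf_weighted_proxy(model, images, labels) on one mini-batch matches the abstract's claim that only one iteration of training is needed to score an architecture.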
Published in: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 04-10 June 2023
Date Added to IEEE Xplore: 05 May 2023
Samsung Research Institute Bangalore, India