M²NAS: Joint Neural Architecture Optimization System With Network Transmission | IEEE Journals & Magazine | IEEE Xplore



Abstract:


Differentiable neural architecture search (NAS) methods have achieved competitive results with low search costs and high performance. Existing differentiable methods focus on searching microstructures within a micro space and leave macrostructures largely unexplored. However, different networks should have different macrostructures rather than a single, evenly distributed one. This article proposes the M²NAS optimization system, which optimizes the macrostructure and microstructure jointly. Specifically, we initialize a network with maximum complexity, in which down-samplings occur at the beginning, and then iteratively optimize the macrostructure and microstructure. For the macro search, we establish a macro space and explore it by generating candidate networks and initializing their weights and architectural parameters, i.e., network transmission. For the micro search, we propose a progressive pruning method that eliminates the large quantization error caused by one-time pruning. As the number of parameters decreases during the search, M²NAS obtains a series of networks of different complexity through network selection, forming a Pareto-optimal set. Results show that the networks obtained by M²NAS achieve a better tradeoff between accuracy and complexity. A network searched on ImageNet achieves 77.4% and 76.3% top-1 accuracy with and without data augmentation during retraining, respectively. When different pretrained models are taken as backbones and combined with Faster R-CNN on COCO, our model reaches 35.6% AP and 55.7% AP₅₀, higher than typical NAS models and second only to the ResNet-50 result.
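The progressive pruning idea described above — removing candidate operations over several rounds rather than discarding all but the strongest ones at once — can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function name, the even per-round schedule, and the magnitude-based drop criterion are all assumptions.

```python
import numpy as np

def progressive_prune(alphas, keep, steps):
    """Illustrative sketch of progressive pruning (assumed details).

    Instead of one-time pruning (keeping only the top-`keep` architectural
    parameters in a single step), spread the removals evenly over `steps`
    rounds, zeroing the smallest-magnitude surviving parameters each round.
    Returns the pruned parameter vector and the surviving count per round.
    """
    alphas = np.asarray(alphas, dtype=float).copy()
    total_drop = alphas.size - keep
    dropped = 0
    history = []
    for step in range(1, steps + 1):
        # cumulative number of parameters that should be gone by this round
        target = round(total_drop * step / steps)
        n_drop = target - dropped
        alive = np.flatnonzero(alphas)
        # surviving parameters, smallest |alpha| first
        order = alive[np.argsort(np.abs(alphas[alive]))]
        alphas[order[:n_drop]] = 0.0
        dropped = target
        history.append(int(np.count_nonzero(alphas)))
    return alphas, history
```

For example, pruning six candidate parameters down to two over two rounds removes two per round, so the intermediate network still retains four operations, giving the remaining weights a chance to adapt before the next cut:

```python
pruned, history = progressive_prune([0.9, 0.1, 0.5, 0.05, 0.7, 0.2],
                                    keep=2, steps=2)
# history -> [4, 2]; the surviving parameters are 0.9 and 0.7
```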
Page(s): 2631 - 2642
Date of Publication: 21 November 2022
