Abstract:
Differentiable architecture search has demonstrated promising results in automatically designing neural network architectures with desired properties, such as high accuracy and low FLOPs. However, it suffers from a cumbersome training process, and injecting constraints into the search phase often relies on hand-crafted heuristic regularizers whose design typically requires tremendous human effort. In this paper, to address these critical challenges, we present FGA-NAS, an efficient method for resource-constrained architecture search. First, to reduce the computational cost and improve search flexibility, we propose a novel condensed search space that merges multiple parallel candidate operations into a single one. Second, to enable gradient-based optimization for neural architecture search (NAS) under multiple combinatorial constraints, we decompose the constrained NAS problem into a few simple sub-problems, without introducing any heuristics, by using the ADMM algorithm [1]. The constrained NAS problem can then be solved by alternately solving these simple sub-problems. Experimental results on ImageNet show that our method discovers efficient and accurate neural network architectures that achieve state-of-the-art results using only 0.2 GPU days.
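To make the ADMM-based decomposition concrete, the following is a minimal toy sketch of an alternating update for architecture parameters under a single linear resource constraint. All names (alpha, flops_cost, budget, rho) and the quadratic "validation loss" proxy are illustrative assumptions, not the paper's actual formulation or code; it only shows the generic ADMM pattern of alternating a gradient-based primal update, a projection onto the feasible set, and a dual update.

```python
# Hypothetical sketch: ADMM-style alternating optimization for constrained
# architecture weights. Not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

n_ops = 8                                   # candidate operations on one edge (toy size)
flops_cost = rng.uniform(1.0, 5.0, n_ops)   # per-operation FLOPs (made-up numbers)
budget = 10.0                               # resource budget: flops_cost @ alpha <= budget

# Toy differentiable proxy for the validation loss over architecture weights alpha.
target = rng.normal(size=n_ops)
def loss_grad(alpha):
    return alpha - target                   # gradient of 0.5 * ||alpha - target||^2

def project_halfspace(v):
    """Project v onto the feasible set {z : flops_cost @ z <= budget}."""
    violation = flops_cost @ v - budget
    if violation <= 0:
        return v
    return v - violation * flops_cost / (flops_cost @ flops_cost)

alpha = np.zeros(n_ops)   # architecture parameters (primal variable)
z = np.zeros(n_ops)       # auxiliary copy kept inside the feasible set
u = np.zeros(n_ops)       # scaled dual variable
rho, lr = 1.0, 0.1

for step in range(200):
    # Sub-problem 1 (alpha-update): gradient steps on loss + (rho/2)||alpha - z + u||^2.
    for _ in range(5):
        g = loss_grad(alpha) + rho * (alpha - z + u)
        alpha -= lr * g
    # Sub-problem 2 (z-update): projection of alpha + u onto the constraint set.
    z = project_halfspace(alpha + u)
    # Dual update.
    u += alpha - z

print("final architecture weights:", np.round(alpha, 3))
print("FLOPs usage:", round(float(flops_cost @ alpha), 3), "budget:", budget)
```

In this pattern the unconstrained loss minimization and the constraint handling are separated: the gradient step never needs a heuristic regularizer, and the projection step absorbs the combinatorial/resource constraint, which is the property the abstract attributes to the ADMM decomposition.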
Date of Conference: 18-23 July 2022
Date Added to IEEE Xplore: 30 September 2022