In differentiable architecture search methods, an efficient search space design can significantly improve the performance of the searched architecture, so the search space must be carefully defined with operations of varying complexity. Meanwhile, rationalizing the search strategy used to explore this well-defined space further improves the speed and efficiency of architecture search. With this in mind, we propose AllegroNAS, a faster and more efficient differentiable architecture search method. Firstly, we introduce a more efficient search space, enriched by two redefined convolution modules. Secondly, we employ a more effective regularization of the architecture parameters, mitigating overfitting during the search and reducing the error introduced by gradient approximation. Meanwhile, we introduce a natural exponential cosine annealing schedule so that the learning rate used during network training better suits the search procedure. Moreover, group convolution and data augmentation are employed to reduce the computational cost. Finally, through extensive experiments on several public datasets, we demonstrate that our method more swiftly searches a more efficient space for better-performing neural network architectures, validating the effectiveness of our approach.
Funding: This work was supported in part by the National Natural Science Foundation of China under Grant 61305001 and the Natural Science Foundation of Heilongjiang Province of China under Grant F201222.
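The exact form of the natural exponential cosine annealing schedule is not given in the abstract. The sketch below shows one plausible reading, in which ordinary cosine annealing is damped by an e^(-t) envelope; the function name nat_exp_cosine_lr and the decay parameter are illustrative assumptions, not taken from the paper.

import math

def nat_exp_cosine_lr(step: int, total_steps: int,
                      lr_max: float = 0.025, lr_min: float = 0.001,
                      decay: float = 1.0) -> float:
    """Cosine annealing damped by a natural exponential envelope.

    Hypothetical reading of the schedule named in the abstract: the
    standard cosine annealing term is scaled by exp(-decay * t), so the
    learning rate both follows the cosine curve and shrinks
    exponentially as training progresses.
    """
    t = step / total_steps                        # progress in [0, 1]
    cosine = 0.5 * (1.0 + math.cos(math.pi * t))  # standard cosine annealing term
    envelope = math.exp(-decay * t)               # natural exponential damping
    return lr_min + (lr_max - lr_min) * cosine * envelope

Under this reading the schedule starts at lr_max, decays faster than plain cosine annealing early in the search, and still reaches lr_min at the final step, which is one way a schedule could be made "more suitable for the search procedure".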
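The abstract does not specify where AllegroNAS applies group convolution, but the cost saving it alludes to is a standard property: with g groups, each filter convolves only 1/g of the input channels, so the weight count and multiply-adds drop by roughly a factor of g. A minimal PyTorch illustration, with channel sizes chosen arbitrarily for the example:

import torch
import torch.nn as nn

# A standard 3x3 convolution versus its grouped counterpart.
dense   = nn.Conv2d(64, 64, kernel_size=3, padding=1)            # 64*64*9 = 36,864 weights
grouped = nn.Conv2d(64, 64, kernel_size=3, padding=1, groups=4)  # 64*16*9 =  9,216 weights

x = torch.randn(1, 64, 32, 32)
assert dense(x).shape == grouped(x).shape  # same output shape, ~4x fewer multiply-adds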