
Journal of Mathematical Techniques and Computational Mathematics (JMTCM)

ISSN: 2834-7706 | DOI: 10.33140/JMTCM

Impact Factor: 1.3

Evolution and Efficiency in Neural Architecture Search: Bridging the Gap between Expert Design and Automated Optimization

Fanfei Meng, Chen-Ao Wang and Lele Zhang

Abstract

This paper provides a comprehensive overview of Neural Architecture Search (NAS), tracing its evolution from manual, expert-driven design to automated, computationally driven optimization. It covers the inception and growth of NAS and its application across domains such as medical imaging and natural language processing. Early methodologies based on reinforcement learning and evolutionary algorithms are examined, together with the heavy computational demands they impose and the more efficient approaches that emerged in response, including Differentiable Architecture Search and hardware-aware NAS. The paper further describes NAS applications in computer vision, NLP, and beyond, demonstrating its versatility in optimizing neural network architectures across diverse tasks. Finally, it addresses future directions and open challenges, including computational efficiency and integration with emerging AI domains, underscoring the continued evolution of NAS toward more sophisticated and efficient architecture search methods.
