Abstract:
The architecture of a neural network serves as the foundation of a deep learning model and plays a crucial role in efficient training and application. Designing high-performance neural network architectures typically requires extensive expertise, along with numerous trials and optimizations of hyperparameters and network structure parameters, demanding significant time and effort. To automate the optimization of LSTM (Long Short-Term Memory) networks, this paper applies evolutionary algorithms to LSTM architecture search and hyperparameter tuning. Additionally, to prevent the LSTM from getting stuck in local optima during backpropagation, particle swarm optimization (PSO) is employed to initialize its weights. Experimental results on real-world datasets demonstrate that the proposed algorithm can provide different neural network architectures tailored to different datasets, achieving competitive predictive capability.
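The abstract does not give implementation details, but the PSO-based weight initialization it mentions can be sketched roughly as follows. This is a minimal illustration, not the paper's method: `toy_loss` is a hypothetical stand-in for the LSTM's training loss evaluated at a flattened candidate weight vector, and all hyperparameter values (`n_particles`, inertia `w`, acceleration coefficients `c1`, `c2`) are assumed defaults.

```python
import random

def pso_init(dim, loss, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO over a flat weight vector.

    Sketch of the abstract's idea: search for a good initial weight
    vector before backpropagation begins. `loss` would be the LSTM's
    training loss at a candidate weight vector; here it is a toy stand-in.
    """
    rng = random.Random(seed)
    # Random initial positions in [-1, 1]^dim, zero initial velocities.
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # each particle's best position
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = loss(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy loss standing in for the LSTM training loss: minimum at all-0.5 weights.
toy_loss = lambda wts: sum((x - 0.5) ** 2 for x in wts)
best, best_val = pso_init(dim=8, loss=toy_loss)
```

The returned `best` vector would then be reshaped into the LSTM's weight matrices and used as the starting point for gradient-based training.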
Published in: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025