Optimizing Radiation Emulator Training: Streamlined Hyperparameter Tuning with Automated Sherpa

Research Square (2023)

Abstract
This study aimed to determine the optimal configuration of neural network emulators for numerical weather prediction while minimizing trial and error, by comparing the performance of emulators whose hidden-layer neuron counts (for 1-5 hidden layers) were defined automatically by the Sherpa library. Emulators with Sherpa-determined neuron counts showed good results, stable performance, and low errors in numerical simulations. Optimal configurations used one or two hidden layers, with a moderate improvement when a second hidden layer was added. The mean number of neurons per hidden layer determined by Sherpa ranged from 153 to 440, yielding a 7-12 fold speedup. These insights can guide the development of neural network emulators for radiative physics, since automatically determined hyperparameters effectively reduce trial-and-error effort while maintaining stable outcomes. Because this study did not identify optimized values for all hyperparameters, further experimentation is recommended to establish the best balance between speed and accuracy. Overall, this research highlights the importance of hyperparameter optimization in designing efficient and accurate neural network emulators for weather prediction.
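The abstract describes letting the Sherpa library choose the number of neurons in each of 1-5 hidden layers. As a rough illustration of how such a search might be set up with the Sherpa Python package, the sketch below tunes layer depth and width for a small regression network. The synthetic data, the parameter ranges, the scikit-learn MLPRegressor stand-in model, and the trial budget are all illustrative assumptions, not the paper's actual emulator or training configuration.

```python
# Minimal sketch of Sherpa-driven tuning of hidden-layer depth and width.
# Data, ranges, and model are placeholders (assumptions), not the paper's setup.
import numpy as np
import sherpa
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for emulator inputs/targets (assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 30))
y = X @ rng.normal(size=30) + 0.1 * rng.normal(size=2000)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)

# Search space: 1-5 hidden layers, 100-500 neurons per layer (illustrative range).
parameters = [
    sherpa.Discrete("num_layers", [1, 5]),
    sherpa.Discrete("num_units", [100, 500]),
]
algorithm = sherpa.algorithms.RandomSearch(max_num_trials=20)
study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=True,
                     disable_dashboard=True)

for trial in study:
    # Build a network with the trial's depth and width, train, and score it.
    layers = tuple([trial.parameters["num_units"]] * trial.parameters["num_layers"])
    model = MLPRegressor(hidden_layer_sizes=layers, max_iter=200, random_state=0)
    model.fit(X_tr, y_tr)
    val_mse = float(np.mean((model.predict(X_va) - y_va) ** 2))
    # Report the validation error for this configuration and close the trial.
    study.add_observation(trial=trial, iteration=1, objective=val_mse)
    study.finalize(trial)

print(study.get_best_result())
```

In a setup like this, the best-performing depth and per-layer neuron count come from `study.get_best_result()`, which is the kind of automatically determined configuration the study compares against hand-tuned emulators.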
Keywords
radiation emulator training, streamlined hyperparameter tuning