A stochastic optimization technique for hyperparameter tuning in reservoir computing

Neurocomputing (2024)

Abstract
This paper presents a new approach to hyperparameter optimization in reservoir computing (RC). RC is a framework for building recurrent neural networks (RNNs) that alleviates the well-known training difficulties of such networks by training only the output layer. Although the weights of the input and the nonlinear hidden layers are randomly generated, adjusting critical hyperparameters, such as the input and feedback scaling factors, is essential for optimal performance in RC. In recent hardware implementations of RC, which are crucial for high-speed processing, standard gradient-based hyperparameter optimization is often inapplicable because the internal functions and parameters may be uncertain or time-varying. In this work, we propose and analyze a stochastic optimization approach that circumvents this problem by using gradient approximations based solely on noisy measurements of the loss function. Our numerical and experimental results confirm that the proposed method can provide near-optimal RC hyperparameters with a substantial reduction in complexity compared to competing methods, validating its potential for RC optimization.
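The abstract does not name the specific estimator, but "gradient approximations based solely on noisy measurements of the loss function" matches the standard simultaneous perturbation stochastic approximation (SPSA) family. The sketch below is only an illustration under that assumption: it tunes the input and feedback scaling of a toy echo state network by perturbing both hyperparameters at once and descending the resulting two-point gradient estimate. The ESN construction, task, gain schedules, and log-space parameterization are all assumptions, not details from the paper.

```python
# A minimal sketch of derivative-free hyperparameter tuning for an echo
# state network (ESN). The SPSA-style update is an assumed illustration of
# "gradient approximation from noisy loss measurements"; the paper's exact
# update rule, gains, and task are not specified in the abstract.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                           # reservoir size
W_in_base = rng.uniform(-1, 1, size=N)            # fixed random input weights
W_base = rng.normal(size=(N, N))
W_base /= np.max(np.abs(np.linalg.eigvals(W_base)))  # unit spectral radius

def esn_loss(theta, u, y, washout=100, noise=1e-3):
    """Noisy validation loss (NMSE) of an ESN whose hyperparameters are
    theta = log(input scaling, feedback/spectral scaling)."""
    in_scale, fb_scale = np.exp(theta)            # keep both scalings positive
    W_in, W = in_scale * W_in_base, fb_scale * W_base
    x = np.zeros(N)
    states = []
    for t in range(len(u)):                       # run the fixed reservoir
        x = np.tanh(W_in * u[t] + W @ x)
        states.append(x.copy())
    X, Y = np.array(states[washout:]), y[washout:]
    # ridge-regress the output layer only (the RC training step)
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)
    err = np.mean((X @ W_out - Y) ** 2) / np.var(Y)
    return err + noise * rng.normal()             # simulated measurement noise

# toy task: one-step-ahead prediction of a noisy sine wave
t = np.arange(1000) * 0.05
u = np.sin(t) + 0.05 * rng.normal(size=len(t))
y = np.roll(u, -1)

theta = np.log(np.array([0.5, 0.5]))              # initial hyperparameter guess
for k in range(1, 51):
    a_k, c_k = 0.2 / k**0.602, 0.1 / k**0.101     # common SPSA gain schedules
    delta = rng.choice([-1.0, 1.0], size=2)       # simultaneous +/-1 perturbation
    g_hat = (esn_loss(theta + c_k * delta, u, y)
             - esn_loss(theta - c_k * delta, u, y)) / (2 * c_k * delta)
    theta -= a_k * g_hat                          # descend the gradient estimate
print("tuned (input, feedback) scaling:", np.exp(theta))
```

Note that each iteration needs only two noisy loss evaluations regardless of the number of hyperparameters, which is why this class of method suits hardware reservoirs whose internal dynamics cannot be differentiated.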
Keywords
Reservoir computing (RC), Hyperparameters, Stochastic optimization, Gradient approximation, RC hardware implementation