SNR: Symbolic network-based rectifiable learning framework for symbolic regression

Neural Networks (2023)

Abstract
Symbolic regression (SR) can be used to uncover the underlying mathematical expressions that describe a given set of observed data. Current SR methods fall into two categories: learning-from-scratch and learning-with-experience. Compared with learning-from-scratch, learning-with-experience yields results comparable to several benchmarks at a significantly lower time cost for obtaining expressions. However, learning-with-experience models perform poorly on unseen data distributions and lack a rectification tool beyond constant optimization, whose performance is limited. In this study, we propose a Symbolic Network-based Rectifiable Learning Framework (SNR) with the ability to correct errors. SNR adopts a Symbolic Network (SymNet) to represent an expression, and the encoding of SymNet is designed to provide supervised information, together with numerous self-generated expressions, to train a policy net (PolicyNet). The trained PolicyNet offers prior knowledge that guides effective searches, and incorrectly predicted expressions are then revised via a rectification mechanism, which endows SNR with broader applicability. Experimental results demonstrate that the proposed method achieves the highest average coefficient of determination on self-generated datasets compared with other state-of-the-art methods and yields more accurate results on public datasets.
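To make the SR objective in the abstract concrete, the sketch below scores a toy pool of candidate expressions against observed data by the coefficient of determination (R²), the metric the paper reports. This is only an illustration of the search target, not the paper's SNR/PolicyNet method; the target function and candidate pool are hypothetical stand-ins for expressions a learned policy would propose.

```python
import numpy as np

def r2_score(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical observed data from an unknown target expression
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=100)
y = x**2 + np.sin(x)  # the "hidden" expression an SR method should recover

# Toy candidate pool standing in for expressions proposed by a search policy
candidates = {
    "x**2 + sin(x)": lambda x: x**2 + np.sin(x),
    "x**2":          lambda x: x**2,
    "sin(x)":        lambda x: np.sin(x),
}

# SR picks the candidate that best explains the data (highest R^2)
best = max(candidates, key=lambda name: r2_score(y, candidates[name](x)))
print(best)  # the exact expression attains R^2 = 1
```

A real SR system searches a combinatorial space of operators and operands rather than a fixed pool; learning-with-experience methods like the one described here use a trained policy to propose promising candidates instead of searching from scratch.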
Keywords
Symbolic regression, Symbolic network, Learning-from-scratch, Learning-with-experience