Function Approximation with Neural Networks and Local Methods: Bias, Variance and Smoothness

msra(1996)

Cited 114 | Views 8

Abstract
We review the use of global and local methods for estimating a function mapping from samples of the function containing noise. The relationship between the methods is examined and an empirical comparison is performed using the multi-layer perceptron (MLP) global neural network model, the single nearest-neighbour model, a linear local approximation (LA) model, and the following commonly used datasets: the Mackey-Glass chaotic time series, the Sunspot time series, British English Vowel data, TIMIT speech phonemes, building energy prediction data, and the sonar dataset. We find that the simple local approximation models often outperform the MLP. No criterion such as classification/prediction, size of the training set, dimensionality of the training set, etc. can be used to distinguish whether the MLP or the local approximation method will be superior. However, we find that if we consider histograms of the k-NN density estimates for the training datasets, then we can choose the best performing method a priori by selecting local approximation when the spread of the density histogram is large and choosing the MLP otherwise. This result correlates with the hypothesis that the global MLP model is less appropriate when the characteristics of the function to be approximated vary throughout the input space. We discuss the results, the smoothness assumption often made in function approximation, and the bias/variance dilemma.
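The selection heuristic described above can be sketched in a few lines: compute a k-NN density estimate for each training point, histogram the (log) densities, and pick the local approximation model when the spread is large. This is a minimal illustrative sketch, not the paper's implementation; the spread statistic (standard deviation of log densities), the value of k, and the threshold are assumptions chosen for illustration.

```python
import numpy as np

def knn_density_estimates(X, k=5):
    """Crude k-NN density proxy: density_i ∝ k / (n * r_k(i)^d),
    where r_k(i) is the distance from point i to its k-th neighbour.
    Brute-force pairwise distances; fine for a sketch."""
    n, d = X.shape
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    dists.sort(axis=1)
    r_k = dists[:, k]  # column 0 is each point's zero distance to itself
    return k / (n * np.maximum(r_k, 1e-12) ** d)

def choose_method(X, k=5, spread_threshold=1.0):
    """Heuristic from the abstract: a large spread in the density
    histogram suggests local approximation; otherwise the MLP.
    The spread statistic and threshold here are illustrative choices."""
    dens = knn_density_estimates(X, k)
    spread = np.std(np.log(dens))  # spread of the (log-)density histogram
    return "local approximation" if spread > spread_threshold else "MLP"
```

In practice a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace the O(n²) distance computation for larger training sets.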
Keywords
neural network,time series,neural network model,function approximation,multi layer perceptron,density estimation