Improving sparsity in kernel adaptive filters using a unit-norm dictionary

22nd International Conference on Digital Signal Processing (DSP), 2017

Cited by 1
Abstract
Kernel adaptive filters, a class of adaptive nonlinear time-series models, are known for their ability to learn expressive autoregressive patterns from sequential data. However, for trivial monotonic signals they struggle to produce accurate predictions while keeping computational complexity within desired bounds. This is because new observations are incorporated into the dictionary whenever they are far from what the algorithm has seen in the past. We propose a novel approach to kernel adaptive filtering that compares new observations against dictionary samples in terms of their unit-norm (normalised) versions, meaning that new observations that look like previous samples but have a different magnitude are not added to the dictionary. We achieve this by proposing the unit-norm Gaussian kernel and defining a sparsification criterion for this novel kernel. The methodology is validated on two real-world datasets against standard kernel adaptive filters (KAF) in terms of the normalised mean square error and the dictionary size.
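The abstract does not spell out the exact admission rule, so the following Python sketch is only illustrative: it assumes a coherence-style criterion in which a new input joins the dictionary only when its unit-norm Gaussian kernel similarity to every stored sample falls below a threshold, inside a plain kernel LMS update. The class UnitNormKLMS and the parameters step, sigma and threshold are hypothetical names introduced here for illustration, not taken from the paper.

```python
import numpy as np

def unit_norm(x, eps=1e-12):
    """Scale x to unit Euclidean norm (illustrative helper)."""
    return x / (np.linalg.norm(x) + eps)

def unit_norm_gaussian_kernel(x, y, sigma=0.5):
    """Gaussian kernel evaluated on the unit-norm versions of x and y."""
    d = unit_norm(np.asarray(x, dtype=float)) - unit_norm(np.asarray(y, dtype=float))
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

class UnitNormKLMS:
    """Kernel LMS with an assumed coherence-style sparsification rule on
    unit-normalised inputs: a new sample is admitted to the dictionary only
    if its normalised version is dissimilar from all stored samples."""

    def __init__(self, step=0.2, sigma=0.5, threshold=0.95):
        self.step = step            # LMS learning rate (assumed value)
        self.sigma = sigma          # kernel bandwidth (assumed value)
        self.threshold = threshold  # kernel-similarity admission threshold
        self.dictionary = []        # stored input vectors
        self.weights = []           # expansion coefficients

    def predict(self, x):
        return sum(w * unit_norm_gaussian_kernel(x, c, self.sigma)
                   for w, c in zip(self.weights, self.dictionary))

    def update(self, x, y):
        err = y - self.predict(x)
        # Admit x only if no existing dictionary element is already very
        # similar once both are normalised; redundant samples are simply
        # discarded in this minimal sketch.
        similar = any(unit_norm_gaussian_kernel(x, c, self.sigma) > self.threshold
                      for c in self.dictionary)
        if not self.dictionary or not similar:
            self.dictionary.append(np.asarray(x, dtype=float))
            self.weights.append(self.step * err)
        return err

# Toy usage: one-step-ahead prediction of a sine wave with growing amplitude,
# using an embedding dimension of 3.
series = np.sin(np.linspace(0, 20, 400)) * np.linspace(1, 3, 400)
model = UnitNormKLMS()
for t in range(3, len(series)):
    model.update(series[t - 3:t], series[t])
print("dictionary size:", len(model.dictionary))
```

Because the admission test is performed on normalised vectors, windows that repeat the same shape at a larger amplitude (as in the toy signal above) do not grow the dictionary, which is the sparsity effect the abstract describes.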
Keywords
sequential data,normalised mean square error,unit-norm dictionary,unit-norm Gaussian kernel,trivial monotonic signals,expressive autoregressive patterns,adaptive nonlinear time-series models,kernel adaptive filters