Novel Shuffle Algorithm for Efficient Training of Neuron Network Digital Predistortion

2023 IEEE MTT-S International Wireless Symposium (IWS)

Abstract
This paper presents a shuffle algorithm that optimizes the training sequence for stochastic and mini-batch training of the neuron network (NN) digital predistortion (DPD) technique. The proposed training sequence ensures efficient convergence by reducing the correlation between consecutive inputs. The approach is validated experimentally on an augmented real-valued time-delay neural network (ARVTDNN). Experimental results show that the proposed training sequence significantly improves both convergence speed and linearization performance.
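The abstract does not spell out the shuffle procedure itself, but the effect it targets, decorrelating consecutive training inputs, can be illustrated with a minimal sketch. The snippet below is a hypothetical toy example, not the paper's algorithm: the tapped_delay_features construction, the synthetic PA signal, and the adjacent-correlation metric are all assumptions made for illustration. It builds ARVTDNN-style tapped-delay inputs from a correlated baseband signal and compares the correlation of adjacent training samples under a sequential ordering versus a random permutation.

```python
import numpy as np

# Illustrative sketch only: the feature construction, signal model, and metric
# below are hypothetical stand-ins, not the paper's exact shuffle algorithm.
rng = np.random.default_rng(0)

def tapped_delay_features(x, memory_depth):
    """Stack [x(n), ..., x(n-M)] real/imag/|.| rows, ARVTDNN-style inputs."""
    rows = []
    for k in range(memory_depth, len(x)):
        taps = x[k - memory_depth:k + 1][::-1]              # x(n) ... x(n-M)
        rows.append(np.concatenate([taps.real, taps.imag, np.abs(taps)]))
    return np.asarray(rows)

# Synthetic correlated baseband signal standing in for measured PA data.
num_samples, memory_depth, batch_size = 4096, 4, 64
phase = np.cumsum(rng.normal(scale=0.05, size=num_samples))
x_pa = np.exp(1j * phase) * (1.0 + 0.1 * rng.normal(size=num_samples))

X = tapped_delay_features(x_pa, memory_depth)               # training inputs
y = x_pa[memory_depth:]                                     # placeholder targets

def mean_adjacent_corr(order, n_pairs=200):
    """Average |correlation| between consecutive training inputs in an ordering."""
    rows = X[order]
    corrs = [abs(np.corrcoef(rows[i], rows[i + 1])[0, 1]) for i in range(n_pairs)]
    return float(np.mean(corrs))

sequential = np.arange(len(X))          # time-ordered training sequence
shuffled = rng.permutation(len(X))      # decorrelated training sequence

print("adjacent-input correlation, sequential:", mean_adjacent_corr(sequential))
print("adjacent-input correlation, shuffled:  ", mean_adjacent_corr(shuffled))

# Mini-batches are then drawn from the shuffled index order.
for start in range(0, len(shuffled) - batch_size + 1, batch_size):
    idx = shuffled[start:start + batch_size]
    X_batch, y_batch = X[idx], y[idx]   # feed to the NN DPD training step
    break                               # one batch shown for illustration
```

On such data the permuted ordering typically shows a much lower adjacent-sample correlation than the sequential ordering, which is the property the proposed training sequence exploits to speed up convergence in stochastic and mini-batch training.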
Keywords
Digital predistortion (DPD), efficient convergence, mini-batch, stochastic, training sequence