Efficient Machine Learning-Enhanced Channel Estimation for OFDM Systems

IEEE Access (2021)

Cited by 7
Abstract
Much recent research has focused on employing deep learning (DL) algorithms to perform channel estimation in upcoming 6G communication systems. However, these DL algorithms are usually computationally demanding and require large numbers of training samples. This work therefore investigates the feasibility of designing efficient machine learning (ML) algorithms that can effectively estimate and track time-varying, frequency-selective channels. The proposed algorithm is integrated with orthogonal frequency-division multiplexing (OFDM) to eliminate the intersymbol interference (ISI) induced by the frequency-selective multipath channel, and it is compared with the well-known least square (LS) and linear minimum mean square error (LMMSE) channel estimation algorithms. The obtained results demonstrate that even when only a small number of pilot samples, N_P, is inserted before the N-subcarrier OFDM symbol, the introduced ML-based channel estimation is superior to the LS and LMMSE algorithms. This advantage is reflected in the bit-error-rate (BER) performance of the proposed algorithm, which attains gains of 2.5 dB and 5.5 dB over the LMMSE and LS algorithms, respectively, when N_P = N/8. Furthermore, the BER performance of the proposed algorithm degrades by only 0.2 dB when the maximum Doppler frequency is randomly varied. Finally, the number of iterations required by the proposed algorithm to converge to the smallest achievable mean-squared error (MSE) is thoroughly examined for various signal-to-noise ratio (SNR) levels.
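For context, the sketch below illustrates the two classical baselines the abstract compares against: LS and LMMSE channel estimation at the N_P pilot subcarriers of a single OFDM symbol. It is not the paper's ML algorithm, and all concrete choices (N = 64, N_P = N/8, unit-power pilots, the 4-tap channel model, and the covariance construction) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): LS and LMMSE channel
# estimation at N_P pilot subcarriers of one OFDM symbol.
# All names and values (N, N_P, snr_db, L, ...) are assumptions.

rng = np.random.default_rng(0)
N, N_P = 64, 8                      # subcarriers and pilots (N_P = N/8)
snr_db = 20.0
noise_var = 10 ** (-snr_db / 10)    # unit-power pilot symbols assumed

# Frequency-selective channel: L-tap impulse response -> frequency response
L = 4
h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
H = np.fft.fft(h, N)                # true channel over the N subcarriers

pilot_idx = np.arange(0, N, N // N_P)
X_p = np.ones(N_P, dtype=complex)   # known unit-power pilots
Y_p = H[pilot_idx] * X_p + np.sqrt(noise_var / 2) * (
    rng.standard_normal(N_P) + 1j * rng.standard_normal(N_P))

# LS estimate at the pilots: divide out the known pilot symbols
H_ls = Y_p / X_p

# LMMSE estimate: regularize LS with the channel covariance at the pilots.
# Here the covariance follows from the assumed tap statistics; in practice
# it must be known or estimated, which is the main cost of LMMSE.
F_p = np.fft.fft(np.eye(N), axis=0)[pilot_idx, :L]  # DFT rows at pilots
R_hh = F_p @ F_p.conj().T / L                       # pilot-domain covariance
H_lmmse = R_hh @ np.linalg.solve(R_hh + noise_var * np.eye(N_P), H_ls)

def mse(est):
    return np.mean(np.abs(est - H[pilot_idx]) ** 2)

print(f"LS MSE:    {mse(H_ls):.4e}")
print(f"LMMSE MSE: {mse(H_lmmse):.4e}")
```

As the abstract notes, LMMSE trades extra computation (the channel covariance and a matrix solve) for a lower MSE than LS; the paper's ML-based estimator is reported to outperform both while remaining less demanding than DL approaches.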
Keywords
Deep learning (DL), machine learning (ML), orthogonal frequency-division multiplexing (OFDM), intersymbol interference (ISI)