Noise Performance Analysis and Optimization of Downsampling Heterodyne Φ-OTDR

IEEE Sensors Journal (2024)

Abstract
The downsampling scheme of the Rayleigh backscattered signal in heterodyne phase-sensitive optical time domain reflectometry (Φ-OTDR) has been proven effective in reducing computational resource consumption, which is a critical issue in areas such as real-time monitoring applications. However, the influence of downsampling on the system's sensing performance has not been explored. In this work, we established a model for the noise deterioration caused by downsampling. The model reveals that the degradation of the power spectral density (PSD) of the demodulated phase noise is attributed to out-of-band noise aliasing into the beat frequency signal. Moreover, the model gives the PSD degradation at different downsampling frequencies. The upper bound of the PSD degradation, reached without any anti-aliasing operations, has been verified experimentally and shows good consistency with the theoretical and simulation predictions (the PSD deviations are less than 0.5 dB). Additionally, we optimized the downsampling process using an anti-aliasing filter with a better-matched characteristic, which reduces the noise PSD while remaining within the predicted degradation range. To the best of our knowledge, this is the first detailed analysis of the influence of downsampling on Φ-OTDR sensing performance. It provides theoretical guidance for practical engineering applications of downsampling.
Keywords
Phase-sensitive optical time domain reflectometry, downsampling, noise analysis
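The aliasing mechanism the abstract describes can be illustrated numerically: decimating a noisy signal without an anti-aliasing filter folds out-of-band noise into the reduced band, raising the in-band noise PSD by up to 10·log10(M) dB for a downsampling factor M, whereas lowpass filtering before decimation avoids this. The sketch below is not the paper's method; the sampling rate, downsampling factor, and the simple moving-average filter (a crude stand-in for the matched anti-aliasing filter discussed in the paper) are all assumptions for illustration, using white noise so the PSD levels can be read off from sample variances.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1e6            # hypothetical original sampling rate (Hz)
M = 4               # hypothetical downsampling factor
n = 1 << 18
x = rng.standard_normal(n)          # white noise, unit variance

# For white noise the PSD level is variance / sampling rate
# (one-sided/two-sided factors cancel in the ratios below).
psd_orig = np.var(x) / fs

# Naive downsampling: keep every M-th sample, no anti-aliasing filter.
# Total noise power is unchanged but the band shrinks by M, so the
# in-band PSD rises by ~10*log10(M) dB -- the upper-bound degradation.
y = x[::M]
psd_naive = np.var(y) / (fs / M)

# Anti-aliasing before decimation: a length-M moving average removes
# most out-of-band noise power, so the PSD level is preserved.
h = np.ones(M) / M
z = np.convolve(x, h, mode="same")[::M]
psd_filt = np.var(z) / (fs / M)

print(10 * np.log10(psd_naive / psd_orig))  # ~ 10*log10(M) ~ 6 dB degradation
print(10 * np.log10(psd_filt / psd_orig))   # ~ 0 dB: degradation avoided
```

With these assumed numbers the naive path degrades the noise PSD by about 6 dB (the 10·log10(4) upper bound), while the filtered path stays near 0 dB, consistent with the paper's claim that a better-matched anti-aliasing filter keeps the noise PSD within the predicted degradation range.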