Enhanced Fault Detection for GNSS/INS Integration Using Maximum Correntropy Filter and Local Outlier Factor
IEEE Transactions on Intelligent Vehicles (2024)
Beijing Jiaotong University
Abstract
Fault detection is crucial for isolating positioning risks in safety-critical applications that use Global Navigation Satellite Systems. Conventional Kalman filter-based fault detection methods focus mainly on satellite measurement faults and presume that the test statistic follows a chi-square distribution. These methods ignore the adverse effect of faults that previously went undetected, and their detection performance degrades under a mismatched distribution assumption. To address these issues, this paper proposes an enhanced fault detection method that combines the Maximum Correntropy Criterion (MCC) and the Local Outlier Factor (LOF). The MCC is introduced to derive a robust extended Kalman filter that copes with previously undetected faults. Simultaneously, a specific Kernel Bandwidth (KB) for each measurement is computed from the innovation and the innovation covariance matrix, removing the inherent restriction of a fixed KB. Moreover, the LOF is used to reconstruct the test statistic, and the detection threshold is calculated by an offline model. Simulations under different fault scenarios show that the adaptive robust estimation reduces the negative influence of undetected faults, so that the filter innovation reflects the actual fault amplitudes. The proposed algorithm effectively improves the fault detection rate and positioning accuracy.
Key words
Fault detection, technological innovation, estimation, satellites, monitoring, mathematical models, rail transportation, integrated navigation, local outlier factor, maximum correntropy, satellite positioning
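The abstract outlines three mechanisms: a correntropy-weighted robust EKF update, a per-measurement adaptive kernel bandwidth built from the innovation and its covariance, and an LOF-based test statistic with an offline threshold. The sketch below shows how such a pipeline could fit together; it is a minimal sketch, not the paper's implementation. The bandwidth rule `sigma2 = alpha * diag(S)` and the tuning constant `alpha` are hypothetical stand-ins (the abstract does not give the exact KB formula), the measurement model and data are synthetic, and scikit-learn's `LocalOutlierFactor` in novelty mode stands in for the paper's offline threshold model.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def mcc_ekf_update(x_pred, P_pred, z, H, R, alpha=2.0):
    """One MCC-weighted Kalman measurement update (illustrative sketch).

    `alpha` scales the per-measurement kernel bandwidth; it is a
    hypothetical tuning parameter, not a value from the paper.
    """
    v = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    # Adaptive kernel bandwidth per measurement: assumed form that ties
    # each bandwidth to the predicted innovation variance, so the Gaussian
    # kernel weight depends on the normalized innovation.
    sigma2 = alpha * np.diag(S)
    w = np.exp(-v ** 2 / (2.0 * sigma2))   # correntropy weights in (0, 1]
    # Inflate the noise variance of suspect channels by 1/w, so they
    # contribute little to the gain (assumes a diagonal R).
    R_w = np.diag(np.diag(R) / np.maximum(w, 1e-8))
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R_w)
    x_upd = x_pred + K @ v
    P_upd = (np.eye(x_pred.size) - K @ H) @ P_pred
    return x_upd, P_upd, v, S

# --- Toy demonstration with one faulty measurement channel -----------------
rng = np.random.default_rng(0)
n, m = 4, 6                                # state and measurement dimensions
H = rng.standard_normal((m, n))
R = np.eye(m) * 0.5
x_true = rng.standard_normal(n)
z = H @ x_true + rng.normal(0.0, np.sqrt(0.5), m)
z[2] += 20.0                               # injected measurement fault

x_pred = x_true + rng.normal(0.0, 0.1, n)
P_pred = np.eye(n) * 0.1
x_upd, P_upd, v, S = mcc_ekf_update(x_pred, P_pred, z, H, R)

# LOF-based test statistic: fit offline on fault-free normalized
# innovations, then score each new innovation online (novelty mode).
e_nominal = rng.standard_normal((2000, m)) # stand-in for fault-free data
lof = LocalOutlierFactor(n_neighbors=20, novelty=True, contamination=1e-3)
lof.fit(e_nominal)

e_now = v / np.sqrt(np.diag(S))            # normalized innovation vector
fault_detected = lof.predict(e_now.reshape(1, -1))[0] == -1
print("fault detected:", fault_detected)
```

Dividing each noise variance by its correntropy weight is a standard way such weights enter a Kalman-type gain: a weight near 1 reproduces the ordinary EKF update, while a small weight effectively removes the suspect channel, which is the mechanism the abstract credits for keeping the innovation aligned with the actual fault amplitude.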