Advances in Retrieving XCH4 and XCO from Sentinel-5 Precursor: Improvements in the Scientific TROPOMI/WFMD Algorithm
Atmospheric Measurement Techniques (2023)
University of Bremen, FB1
Abstract
The TROPOspheric Monitoring Instrument (TROPOMI) on board the Sentinel-5 Precursor satellite enables the accurate determination of atmospheric methane (CH4) and carbon monoxide (CO) abundances at high spatial resolution with global daily sampling. Due to its wide swath and sampling, the global distribution of both gases can be determined in unprecedented detail. The scientific retrieval algorithm Weighting Function Modified Differential Optical Absorption Spectroscopy (WFMD) has proven valuable for simultaneously retrieving the atmospheric column-averaged dry-air mole fractions XCH4 and XCO from TROPOMI's radiance measurements in the shortwave infrared (SWIR) spectral range. Here we present recent improvements to the algorithm, which have been incorporated into the current version (v1.8) of the TROPOMI/WFMD product. These include processing adjustments such as increasing the polynomial degree to 3 in the fitting procedure, to better account for possible spectral albedo variations within the fitting window, and updating the digital elevation model, to minimise topography-related biases. In the post-processing, the machine-learning-based quality filter has been refined by using additional data when training the random forest classifier, further reducing the number of scenes with residual cloudiness that are incorrectly classified as good. In particular, cloud filtering over the Arctic Ocean is considerably improved. Furthermore, the machine-learning calibration, which addresses systematic errors due to simplifications in the forward model or instrumental issues, has been optimised: by including an additional feature associated with the fitted polynomial when training the corresponding random forest regressor, spectral albedo variations are better accounted for. To remove vertical stripes in the XCH4 and XCO data, an efficient orbit-wise destriping filter based on combined wavelet–Fourier filtering has been implemented, while optimally preserving the original spatial trace gas features.
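The general idea behind combined wavelet–Fourier destriping can be illustrated with a minimal sketch. This is not the operational TROPOMI/WFMD implementation: it assumes a single-level Haar decomposition (the operational filter would typically use a multi-level decomposition and tuned parameters), even image dimensions, column-oriented stripes, and a hand-chosen Gaussian damping width `sigma`; all function names are ours.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar transform (assumes even dimensions)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a - b + c - d) / 2.0  # high-pass across columns: captures vertical stripes
    hl = (a + b - c - d) / 2.0  # high-pass across rows
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def destripe(img, sigma=3.0):
    """Suppress column-wise stripes while keeping other spatial features.

    Stripes that are constant along the row axis concentrate at near-zero
    frequency (along that axis) in the column-detail band, so a Gaussian
    notch there removes them with little effect on real structure.
    """
    ll, lh, hl, hh = haar_dwt2(img)
    f = np.fft.fft(lh, axis=0)
    k = np.fft.fftfreq(lh.shape[0]) * lh.shape[0]      # integer frequencies
    damp = 1.0 - np.exp(-(k ** 2) / (2.0 * sigma ** 2))
    f *= damp[:, None]                                  # notch out stripe energy
    lh = np.real(np.fft.ifft(f, axis=0))
    return haar_idwt2(ll, lh, hl, hh)
```

Because only the stripe-carrying detail band is filtered, and only its low along-track frequencies are damped, the rest of the trace gas field passes through unchanged, which is why this class of filter preserves spatial features better than simple column-mean subtraction.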
The temporal coverage of the data records has been extended to the end of April 2022, covering a total of 4.5 years since the start of the mission, and will be extended further in the future. Validation against the ground-based Total Carbon Column Observing Network (TCCON) demonstrates that the implemented improvements reduce the pseudo-noise component of the products, resulting in a smaller random error. The XCH4 and XCO products have similar spatial coverage from year to year, including high latitudes and the oceans. The analysis of annual growth rates reveals accelerated growth of atmospheric methane during the covered period, in line with observations at marine surface sites of the Global Monitoring Division of NOAA's Earth System Research Laboratory, which reported consecutive annual record increases in 2020 and 2021.
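For orientation, an annual growth rate of a column-averaged mole fraction can be approximated from yearly means. This is a deliberately simplified, hypothetical sketch, not the paper's or NOAA's method (published growth rates are typically derived from deseasonalized trend curves); the function name and units (XCH4 in ppb) are our assumptions.

```python
import numpy as np

def annual_growth_rates(years, values):
    """Growth rate per year as the difference of consecutive annual means.

    `years` and `values` are parallel sequences (year label, XCH4 in ppb).
    Illustration only: no deseasonalization or gap handling is applied.
    """
    yrs = np.asarray(years)
    vals = np.asarray(values, dtype=float)
    uniq = np.unique(yrs)
    means = {int(y): float(vals[yrs == y].mean()) for y in uniq}
    return {int(y2): means[int(y2)] - means[int(y1)]
            for y1, y2 in zip(uniq[:-1], uniq[1:])}
```

With such an estimator, "accelerated growth" corresponds to the returned per-year increments themselves increasing over the covered period.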
Key words
Atmospheric Composition