
Improved Inland Water Level Estimates with Sentinel-6 Fully Focused SAR Processing: A Case Study in the Ebre River Basin

Xavier Domingo, Ferran Gibert, Robert Molina, Maria Jose Escorihuela

Remote Sensing (2025)

Abstract
The observation of small to medium inland water targets with nadir radar altimeters is currently limited by the along-track resolution of UnFocused SAR (UFSAR) altimetry, which is approximately 300 m for Delay-Doppler processors. In this study, we analyze the benefits of the sub-meter along-track resolution provided by Fully Focused SAR (FFSAR) altimetry applied to Sentinel-6 Michael Freilich data over a collection of small to medium targets in the Ebre Basin, Spain. The water level estimates obtained over a 2-year period are compared to in situ data to evaluate the long-term accuracy of the algorithm. The proposed FFSAR altimetry methodology achieves an average MAD precision of roughly 4 cm and allows for a fully operational implementation, as it can be run in a completely unsupervised manner. The precision improvement with respect to Delay-Doppler products over the same targets is essentially attributed to the ability of FFSAR to better filter out waveforms contaminated by off-nadir scatterers. Moreover, we evaluate the application of extended water masks, which exploit altimeter measurements where water lies at nadir or up to 250 m across-track from nadir; this increases the number of valid measurements per pass by an average of 48% while maintaining the same level of accuracy as nadir-only measurements over water. We thus demonstrate the potential of FFSAR altimetry to monitor the water level of small to medium inland water targets.
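The abstract quotes two quantities that can be made concrete: the MAD precision (median absolute deviation of the differences between altimeter-derived and in situ water levels, roughly 4 cm on average) and the extended water mask, which admits measurements with water at nadir or up to 250 m across-track from nadir. The following minimal sketch (Python with NumPy) illustrates how such quantities could be computed; the function names, variable names, and sample values are hypothetical and are not taken from the authors' processing chain.

```python
# Minimal sketch, not the authors' code. Assumes water surface elevations in
# metres and across-track distances in metres; all names and values are
# illustrative only.
import numpy as np

def mad_precision(altimeter_wse, in_situ_wse):
    """MAD of the differences between altimeter-derived and in situ
    water surface elevations (same units as the inputs)."""
    diff = np.asarray(altimeter_wse) - np.asarray(in_situ_wse)
    return np.median(np.abs(diff - np.median(diff)))

def extended_water_mask(cross_track_dist_m, max_dist_m=250.0):
    """Keep measurements whose water body lies at nadir or within
    max_dist_m across-track from nadir (250 m in the abstract)."""
    return np.abs(np.asarray(cross_track_dist_m)) <= max_dist_m

# Hypothetical usage with synthetic numbers:
wse_ffsar = np.array([102.41, 102.38, 102.52, 102.47])  # FFSAR estimates (m)
wse_gauge = np.array([102.44, 102.40, 102.49, 102.45])  # in situ gauge (m)
print(f"MAD precision: {mad_precision(wse_ffsar, wse_gauge) * 100:.1f} cm")

dist = np.array([0.0, 180.0, 320.0, 90.0])  # across-track distance to water (m)
print("kept by extended mask:", extended_water_mask(dist))
```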
Key words
radar altimetry, FFSAR, UFSAR, water level, inland waters