Abstract

The propagation of spaceborne radar signals operating at L-band frequencies or below can be seriously affected by the ionosphere. During periods of high solar activity, Faraday rotation (FR) and signal path delays disturb radar polarimetry and reduce resolution in range and azimuth. While these effects are negligible at X-band, FR and the frequency-dependent path delays can become seriously problematic at L-band and below. For quality assurance and calibration purposes, existing L-band and potential P-band spaceborne missions require an estimate of the ionospheric state before or during the data take. This paper introduces two approaches for measuring the ionospheric total electron content (TEC) from single-polarized spaceborne SAR data, and both are demonstrated using simulations. The two methods leverage knowledge of the frequency-dependent path delay through the ionosphere: the first estimates TEC from the phase error of the matched-filter mismatch, while the second measures path-delay differences between up and down chirps. FR, mean (direct current) offsets, and noise contributions are also considered in the simulations. Finally, possibilities for further methodological improvements are discussed.
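Both estimation approaches rest on the standard first-order magneto-ionic relations for the ionospheric group path delay and the Faraday rotation angle, which scale with TEC/f². The sketch below is not the paper's estimator; it is a minimal illustration of these textbook formulas, showing why the effects are negligible at X-band but severe at L- and P-band. The band frequencies, TEC value, and magnetic-field strength are illustrative assumptions, not values taken from the paper.

```python
import math

K_DELAY = 40.3   # m^3/s^2, first-order ionospheric dispersion constant
K_FR = 2.365e4   # Faraday-rotation constant (SI units, quasi-longitudinal approx.)

def path_delay_m(tec_el_m2: float, f_hz: float) -> float:
    """One-way ionospheric group path delay in meters."""
    return K_DELAY * tec_el_m2 / f_hz**2

def faraday_rotation_rad(tec_el_m2: float, f_hz: float,
                         b_tesla: float = 4.0e-5,   # assumed field strength
                         cos_theta: float = 1.0) -> float:
    """One-way Faraday rotation angle in radians."""
    return K_FR * b_tesla * cos_theta * tec_el_m2 / f_hz**2

def tec_from_delay(delay_m: float, f_hz: float) -> float:
    """Invert the first-order delay relation to recover TEC (el/m^2)."""
    return delay_m * f_hz**2 / K_DELAY

if __name__ == "__main__":
    tec = 50 * 1e16  # 50 TECU, a plausible high-solar-activity value (assumption)
    for band, f in (("X-band", 9.65e9), ("L-band", 1.27e9), ("P-band", 0.435e9)):
        d = path_delay_m(tec, f)
        fr = math.degrees(faraday_rotation_rad(tec, f))
        print(f"{band}: delay = {d:7.2f} m, Faraday rotation = {fr:6.1f} deg")
```

For the assumed 50 TECU, this yields a one-way delay of roughly 0.2 m at X-band versus about 12 m at L-band and over 100 m at P-band, consistent with the abstract's claim; `tec_from_delay` illustrates how a measured excess delay at a known carrier frequency can be inverted to TEC.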

  • Publication date: June 2010