Abstract

This study compares biases between simulated and observed Advanced Himawari Imager (AHI) brightness temperatures at night over China using three land surface emissivity (LSE) datasets. The University of Wisconsin-Madison High Spectral Resolution Emissivity dataset, the Combined Advanced Spaceborne Thermal Emission and Reflection Radiometer and Moderate Resolution Imaging Spectroradiometer Emissivity database over Land High Spectral Resolution Emissivity dataset, and the International Geosphere-Biosphere Programme (IGBP) infrared LSE module, together with land skin temperature observations from the National Basic Meteorological Observing stations in China, are used as inputs to the Community Radiative Transfer Model. The results suggest that the standard deviations of AHI observations minus background simulations (OMBs) are largely consistent across the three LSE datasets. Negative OMB biases in brightness temperature occur uniformly for all three datasets. Over cropland and forest surface types, there are no significant differences in OMB biases estimated with the three LSE datasets for any of the five AHI surface-sensitive channels. Over the grassland surface type, significant differences (approximately 0.8 K) are found at the 10.4-, 11.2-, and 12.4-μm channels when the IGBP dataset is used. Over nonvegetated surface types (e.g., sandy land, gobi, and bare rock), the lack of monthly variation in the IGBP LSE introduces large negative biases at the 3.9- and 8.6-μm channels, larger than those from the other two LSE datasets. Thus, simulations of the AHI infrared surface-sensitive channels can be improved by using spatially and temporally varying LSE estimates.
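To make the stratified OMB statistics described above concrete, the following is a minimal, hypothetical Python sketch of how the per-channel, per-surface-type bias (mean OMB) and standard deviation could be computed for each LSE dataset; the file name and column names are assumptions for illustration and are not taken from this study.

```python
# Minimal sketch (not the authors' code): compute observation-minus-background
# (OMB) bias and standard deviation per AHI channel and land-cover type for
# each LSE dataset. The input file and column names are hypothetical.
import pandas as pd

# Each row: one matched nighttime AHI observation and its CRTM simulation.
# Assumed columns: 'lse_dataset', 'channel_um', 'land_cover',
# 'bt_observed', 'bt_simulated' (brightness temperatures in K).
df = pd.read_csv("ahi_omb_matchups.csv")

# OMB = observed minus simulated brightness temperature.
df["omb"] = df["bt_observed"] - df["bt_simulated"]

# Bias and standard deviation, stratified as in the abstract:
# by emissivity dataset, surface-sensitive channel, and surface type.
stats = (
    df.groupby(["lse_dataset", "channel_um", "land_cover"])["omb"]
      .agg(bias="mean", std="std", n="count")
      .reset_index()
)
print(stats)
```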