Long-term stability of TES satellite radiance measurements
Abstract. Using Tropospheric Emission Spectrometer (TES) Level 2 (L2) retrieval products to assess long-term changes in atmospheric trace gas composition requires knowledge of the overall radiometric stability of the Level 1B (L1B) radiances. The purpose of this study is to evaluate the stability of the radiometric calibration of the TES instrument by analyzing the difference between measured and calculated brightness temperatures in selected window regions of the spectrum. Global Modeling and Assimilation Office (GMAO) temperature and water vapor profiles and the Real-Time Global Sea Surface Temperature (RTGSST) analysis are used as input to the Optimal Spectral Sampling (OSS) radiative transfer model to calculate the simulated spectra. The TES reference measurements selected cover the 4-year period from mid-2005 through mid-2009, with the selection criteria being: observation latitudes between −30° and 30°, over ocean, Global Survey mode (nadir view), and retrieved cloud optical depth less than or equal to 0.01. The TES cloud optical depth retrievals are used only for screening; no cloud effects on the radiances are included in the forward model. This initial screening yields over 55 000 potential reference spectra spanning the four-year period. We present a trend analysis of the time series of the residuals (observations minus calculations) in the TES 2B1, 1B2, 2A1, and 1A1 bands; the standard deviation of the residuals is approximately 0.6 K for bands 2B1, 1B2, and 2A1, and 0.9 K for band 1A1. The analysis demonstrates that the trend in the residuals is not significantly different from zero over the 4-year period. This is one method of demonstrating that the relative radiometric calibration is stable over time, which is essential for any longer-term analysis of TES L2 retrieved products, particularly for well-mixed species such as carbon dioxide and methane.
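The trend test described in the abstract can be sketched as a simple least-squares fit of the residual time series, where a slope consistent with zero (within its uncertainty) indicates a stable calibration. This is a minimal illustrative sketch, not the authors' actual analysis code; the function name `residual_trend` and the use of NumPy are assumptions made here for the example.

```python
import numpy as np


def residual_trend(times_yr, residuals_K):
    """Fit a linear trend to obs-minus-calc brightness temperature
    residuals.

    Parameters: decimal years and residuals in kelvin.
    Returns the slope (K / year) and its 1-sigma standard error,
    so a slope within roughly two standard errors of zero is
    consistent with no drift in the radiometric calibration.
    """
    # Least-squares fit of residual = a*t + b, with the parameter
    # covariance matrix returned for the slope uncertainty.
    (a, b), cov = np.polyfit(times_yr, residuals_K, deg=1, cov=True)
    slope_err = np.sqrt(cov[0, 0])  # 1-sigma uncertainty on the slope
    return a, slope_err
```

In practice one would apply such a fit to each band's residual series separately and compare the fitted slope against its standard error before concluding the trend is not significantly different from zero.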