Evaluation of GEOS-Simulated L-Band Microwave Brightness Temperature Using Aquarius Observations over Non-Frozen Land across North America

Date

2020-09-22

Citation

Park, J.; Forman, B.A.; Reichle, R.H.; De Lannoy, G.; Tarik, S.B. Evaluation of GEOS-Simulated L-Band Microwave Brightness Temperature Using Aquarius Observations over Non-Frozen Land across North America. Remote Sens. 2020, 12, 3098.

Abstract

L-band brightness temperature (Tb) is one of the key remotely sensed variables that provides information regarding surface soil moisture conditions. In order to harness the information in Tb observations, a radiative transfer model (RTM) is investigated for eventual inclusion into a data assimilation framework. In this study, Tb estimates from the RTM implemented in the NASA Goddard Earth Observing System (GEOS) were evaluated against the nearly four-year record of daily Tb observations collected by the L-band radiometers onboard the Aquarius satellite. Statistics between the modeled and observed Tb were computed over North America as a function of soil hydraulic properties and vegetation types. Overall, the statistics showed good agreement between the modeled and observed Tb, with a relatively low domain-average bias (0.79 K ascending; −2.79 K descending), root mean squared error (11.0 K ascending; 11.7 K descending), and unbiased root mean squared error (8.14 K ascending; 8.28 K descending). In terms of soil hydraulic parameters, large porosity and large wilting point both led to high uncertainty in the modeled Tb due to the large variability in the dielectric constant and surface roughness used by the RTM. The performance of the RTM as a function of vegetation type suggests better agreement in regions with broadleaf deciduous and needleleaf forests, while grassland regions exhibited the lowest accuracy among the five vegetation types.
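For readers who want to reproduce the skill metrics reported above, the minimal sketch below shows how bias, root mean squared error (RMSE), and unbiased RMSE (ubRMSE) are conventionally computed from collocated modeled and observed Tb time series. The function name, NaN screening, and the synthetic data in the usage example are illustrative assumptions, not the authors' actual evaluation code or values.

```python
import numpy as np

def tb_evaluation_stats(tb_model, tb_obs):
    """Bias, RMSE, and unbiased RMSE (ubRMSE) between modeled and
    observed brightness temperature time series, all in kelvin.

    tb_model, tb_obs : 1-D arrays of collocated Tb values [K].
    Pairs with missing data (NaN) are dropped before computing stats.
    """
    tb_model = np.asarray(tb_model, dtype=float)
    tb_obs = np.asarray(tb_obs, dtype=float)
    valid = ~np.isnan(tb_model) & ~np.isnan(tb_obs)
    diff = tb_model[valid] - tb_obs[valid]

    bias = diff.mean()                    # mean(model - obs)
    rmse = np.sqrt(np.mean(diff ** 2))    # total error magnitude
    ubrmse = np.sqrt(rmse**2 - bias**2)   # random error with bias removed
    return bias, rmse, ubrmse

# Illustrative use with synthetic data (hypothetical, not Aquarius/GEOS):
rng = np.random.default_rng(0)
obs = 260.0 + 10.0 * rng.standard_normal(1000)        # synthetic observed Tb [K]
model = obs + 0.8 + 8.0 * rng.standard_normal(1000)   # biased, noisy modeled Tb
print(tb_evaluation_stats(model, obs))
```

The identity ubRMSE = sqrt(RMSE^2 − bias^2) follows from decomposing the mean squared error into a squared-bias term and a variance term, which is why the ubRMSE values quoted in the abstract are always smaller than the corresponding RMSE.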
