C32B-07
Comparison of Three Models for Snow Microwave Brightness Temperature Simulation
Wednesday, 16 December 2015: 11:50
3007 (Moscone West)
Alexandre Roy (1), Alain Royer (1), Benoit Montpetit (1), Ghislain Picard (2), Ludovic Brucker (3) and Alexandre Langlois (1), (1)University of Sherbrooke, Sherbrooke, QC, Canada, (2)LGGE Laboratoire de Glaciologie et Géophysique de l'Environnement, Saint Martin d'Hères, France, (3)NASA Goddard Space Flight Center, Greenbelt, MD, United States
Abstract:
This presentation compares three microwave radiative transfer models commonly used for snow brightness temperature (TB) simulations: the Dense Media Radiative Transfer – Multi Layers model (DMRT-ML), the Microwave Emission Model of Layered Snowpacks (MEMLS) and the Helsinki University of Technology n-layers model (HUT n-layers). Driven with the same new, comprehensive sets of detailed measured snowpack physical properties, the three models, which are based on different electromagnetic approaches, were compared through their simulated TBs at 11, 19 and 37 GHz. Each model uses a different snow grain-size metric: measured specific surface area (SSA), correlation length calculated from the Debye relationship, and measured maximum grain extent, respectively. Comparison with surface-based radiometric measurements for different types of snow (in southern Québec and in subarctic and arctic areas) shows similar average root mean square errors, on the order of 10 K or less, between measured and simulated TBs when the simulations are optimized with scaling factors applied to these metrics. In practice, this means that the approaches of these models, ranging from physical to empirical, converge to similar results when driven with appropriately scaled in-situ measurements. We discuss the results relative to the uncertainties in snow microstructure measurements. In particular, we show that the scaling factor that must be applied to the SSA measurements to minimize the difference between DMRT-ML simulated and measured TBs is not explained by uncertainty in the SSA measurements.
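To make the grain-metric handling concrete, the sketch below (not part of the original study; all numerical values and the scaling factor are hypothetical) illustrates one common form of the Debye relationship used to convert measured SSA and snow density into a correlation length, a multiplicative scaling factor of the kind applied to a metric during optimization, and the root mean square error computed between simulated and measured TBs.

```python
import numpy as np

RHO_ICE = 917.0  # density of ice (kg m-3)

def debye_correlation_length(ssa, rho_snow):
    """Correlation length (m) from SSA (m2 kg-1) and snow density (kg m-3),
    using the Debye relationship p_c = 4 (1 - f) / (SSA * rho_ice),
    where f = rho_snow / rho_ice is the ice volume fraction."""
    f = np.asarray(rho_snow) / RHO_ICE
    return 4.0 * (1.0 - f) / (np.asarray(ssa) * RHO_ICE)

def rmse(tb_sim, tb_meas):
    """Root mean square error (K) between simulated and measured TBs."""
    diff = np.asarray(tb_sim) - np.asarray(tb_meas)
    return np.sqrt(np.mean(diff ** 2))

# Illustrative two-layer snowpack (hypothetical measurements)
ssa = np.array([25.0, 12.0])      # measured SSA (m2 kg-1)
rho = np.array([180.0, 320.0])    # measured density (kg m-3)
phi = 1.3                         # hypothetical multiplicative scaling factor

p_c = debye_correlation_length(ssa, rho)  # correlation-length metric
p_scaled = phi * p_c                      # scaled metric used in the optimization

# Hypothetical simulated vs. measured TBs (K) at 11, 19 and 37 GHz
print(rmse([243.1, 251.4, 228.9], [245.0, 249.8, 232.5]))
```

In the study itself, the scaling factor for each model and metric is found by minimizing exactly this kind of error statistic against the surface-based radiometer measurements; the code above only illustrates the bookkeeping, not any of the three emission models.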