IN51A-1790
The Bayesian Data Fusion Approach to Spatial and Temporal Fusion of Remotely Sensed Images
Friday, 18 December 2015
Poster Hall (Moscone South)
Jie Xue, Yee Leung, and Tung Fung, The Chinese University of Hong Kong, Hong Kong
Abstract:
Remote sensing is a rich source of data for monitoring the spatial and temporal dynamics of geophysical processes. However, single-sensor systems are constrained by their inherent technological trade-off between spatial and temporal resolution. To obtain images with both high spatial and high temporal resolution, a number of image fusion algorithms, such as STARFM and ESTARFM, have been developed. However, ESTARFM assumes a single rate of change over the entire time period, while STARFM assumes a linear trend of change between two points in time. Such assumptions about land-cover change over a study period are generally unrealistic. Furthermore, existing methods do not take into account the correlation information contained in the time series of the phenomenon under study, which is instrumental to the fusion process. Therefore, to capitalize on the information available in a fusion process, we propose a Bayesian data fusion (BDF) approach that treats the fusion of remotely sensed data as an estimation problem: the fused image is estimated by employing a first-order observation model, combining the correlation and change-tendency information in the time-series images in the form of a joint distribution, and incorporating prior information on the desired image. On the basis of this general formulation, we test the approach with both simulated data and actual Landsat/MODIS acquisitions. Experimental results demonstrate that the proposed method outperforms STARFM and ESTARFM, especially for heterogeneous landscapes, producing fused surface reflectances with the highest correlations to the reference Landsat images.
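To illustrate the general idea of casting fusion as Bayesian estimation under a linear (first-order) observation model, the following is a minimal Gaussian sketch, not the authors' full BDF formulation (which additionally exploits time-series correlation and change-tendency information). The function name `bayesian_fusion`, the observation operator `H`, the covariance values, and the toy block size are all illustrative assumptions.

```python
# Minimal sketch: Gaussian Bayesian fusion of a fine-resolution prior
# (e.g., an earlier Landsat-like scene) with a coarse observation
# (e.g., a MODIS-like pixel) linked by a linear observation model y = H x + noise.
# This is an assumed toy setup, not the BDF method described in the abstract.
import numpy as np


def bayesian_fusion(prior_mean, prior_cov, obs, H, obs_cov):
    """Return the Gaussian posterior mean and covariance of x given y = H x + noise.

    prior_mean : (n,)   prior estimate of the fine-resolution reflectance
    prior_cov  : (n, n) prior covariance
    obs        : (m,)   coarse-resolution observation
    H          : (m, n) observation operator mapping fine pixels to coarse pixels
    obs_cov    : (m, m) observation-noise covariance
    """
    S = H @ prior_cov @ H.T + obs_cov           # innovation covariance
    K = prior_cov @ H.T @ np.linalg.inv(S)      # gain
    post_mean = prior_mean + K @ (obs - H @ prior_mean)
    post_cov = prior_cov - K @ H @ prior_cov
    return post_mean, post_cov


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 16                                       # a 4x4 block of fine pixels
    true_x = rng.uniform(0.05, 0.4, n)           # synthetic surface reflectance
    H = np.full((1, n), 1.0 / n)                 # coarse pixel = mean of fine pixels
    y = H @ true_x + rng.normal(0, 0.005, 1)     # noisy coarse observation
    prior_mean = true_x + rng.normal(0, 0.03, n) # imperfect fine-resolution prior
    prior_cov = 0.03 ** 2 * np.eye(n)
    obs_cov = 0.005 ** 2 * np.eye(1)

    fused, _ = bayesian_fusion(prior_mean, prior_cov, y, H, obs_cov)
    print("RMSE prior:", np.sqrt(np.mean((prior_mean - true_x) ** 2)))
    print("RMSE fused:", np.sqrt(np.mean((fused - true_x) ** 2)))
```

In this toy case the coarse observation only constrains the block mean; the full BDF approach described above additionally conditions on the joint distribution of the time-series images, which is where the reported gains over STARFM and ESTARFM in heterogeneous landscapes come from.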