H11M-08
Likelihood-Informed Dimension Reduction for Bayesian Inverse Problems
Monday, 14 December 2015: 09:45
3016 (Moscone West)
Tiangang Cui, Massachusetts Institute of Technology, Cambridge, MA, United States
Abstract:
The intrinsic dimensionality of an inverse problem is affected by prior information, the accuracy and number of observations, and the smoothing properties of the forward operator. From a Bayesian perspective, changes from the prior to the posterior may, in many problems, be confined to a relatively low-dimensional subspace of the parameter space. We present a dimension reduction approach that defines and identifies such a subspace, called the "likelihood-informed subspace" (LIS), by characterizing the relative influences of the prior and the likelihood over the support of the posterior distribution. This identification enables new and more efficient computational methods for Bayesian inference with nonlinear forward models and Gaussian priors. In particular, we approximate the posterior distribution as the product of a lower-dimensional posterior defined on the LIS and the prior distribution marginalized onto the complementary subspace. Markov chain Monte Carlo sampling can then proceed in lower dimensions, with significant gains in computational efficiency. We also introduce a Rao-Blackwellization strategy that de-randomizes Monte Carlo estimates of posterior expectations for additional variance reduction. The efficacy of our method has been demonstrated on a wide range of applications, including atmospheric remote sensing, X-ray imaging, and groundwater inversion.
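A subspace of this kind is commonly identified by comparing the curvature of the data misfit against the prior covariance in whitened coordinates. The sketch below illustrates one way such a construction could look; it is not the authors' code. The function name, the use of Gauss-Newton Hessians assembled from forward-model Jacobians at a set of samples, and the eigenvalue threshold `tau` are all illustrative assumptions.

```python
import numpy as np

def likelihood_informed_subspace(J_samples, Gamma_obs, Gamma_pr, tau=0.1):
    """Illustrative sketch of an LIS-type construction (hypothetical helper).

    J_samples : list of forward-model Jacobians dG/dx, each of shape (m, n),
                evaluated at a set of (approximate) posterior samples
    Gamma_obs : (m, m) observation-noise covariance
    Gamma_pr  : (n, n) Gaussian prior covariance
    tau       : eigenvalue threshold separating likelihood-informed from
                prior-dominated directions (assumed tuning parameter)
    """
    L = np.linalg.cholesky(Gamma_pr)           # prior factor, Gamma_pr = L L^T
    Gobs_inv = np.linalg.inv(Gamma_obs)

    # Average the prior-preconditioned Gauss-Newton Hessian of the data misfit
    # over the sample set: H_hat ~ E[ L^T J^T Gamma_obs^{-1} J L ].
    n = Gamma_pr.shape[0]
    H_hat = np.zeros((n, n))
    for J in J_samples:
        JL = J @ L
        H_hat += JL.T @ Gobs_inv @ JL
    H_hat /= len(J_samples)

    # Eigendecomposition in whitened coordinates; eigenvalues above tau mark
    # directions where the likelihood dominates the prior.
    eigvals, eigvecs = np.linalg.eigh(H_hat)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    r = int(np.sum(eigvals > tau))

    # Map the retained directions back to parameter space to obtain a basis
    # for the likelihood-informed subspace.
    Phi = L @ eigvecs[:, :r]
    return Phi, eigvals
```

Under these assumptions, sampling would target only the coefficients of the retained basis `Phi`, while the complementary directions are handled by the marginalized prior, consistent with the posterior factorization described in the abstract.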