Author: Agapiou, Sergios
Languages: English
Types: Doctoral thesis
Subjects: QA
The goal of this thesis is to contribute to the formulation and understanding of the Bayesian approach to inverse problems in function space. To this end we examine two important aspects of this approach: the frequentist asymptotic properties of the posterior, and the extraction of information from the posterior via sampling. We work in a separable Hilbert space setting and consider Gaussian priors on the unknown in conjugate Gaussian models. In the first part of this work we consider linear inverse problems with Gaussian additive noise and study the contraction in the small noise limit of the Gaussian posterior distribution to a Dirac measure centered on the true parameter underlying the data. In a wide range of situations, which include both mildly and severely ill-posed problems, we show how carefully calibrating the scaling of the prior as a function of the size of the noise, based on a priori known information on the regularity of the truth, yields optimal rates of contraction. In the second part we study the implementation in R^N of hierarchical Bayesian linear inverse problems with Gaussian noise and priors, and with hyper-parameters introduced through the scalings of the prior and noise covariance operators. We use function space intuition to understand the large N behaviour of algorithms designed to sample the posterior and show that the two scaling hyper-parameters evolve under these algorithms in contrasting ways: as N grows the prior scaling slows down while the noise scaling speeds up. We propose a reparametrization of the prior scaling which is robust with respect to the increase in dimension. Our theory on the slowing down of the evolution of the prior scaling extends to hierarchical approaches in more general conjugate Gaussian settings, while our intuition covers other parameters of the prior covariance operator as well. Throughout the thesis we use a blend of results from measure theory and probability theory with tools from the theory of linear partial differential equations and numerical analysis.
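As a compact reference for the conjugate Gaussian setting the abstract describes, the following is a minimal sketch of the standard posterior formulas for a finite-dimensional linear model; the symbols (forward operator K, prior covariance C_0, scalings delta and lambda) are our notation for illustration and are not taken from the thesis itself:

% Sketch only: standard conjugate Gaussian posterior, notation assumed.
\[
  y = Ku + \eta, \qquad \eta \sim N(0,\, \delta^{-1} I), \qquad u \sim N(0,\, \lambda^{-1} C_0),
\]
\[
  u \mid y \;\sim\; N(m,\, \mathcal{C}), \qquad
  \mathcal{C} = \bigl(\delta K^{\ast} K + \lambda C_0^{-1}\bigr)^{-1}, \qquad
  m = \delta\, \mathcal{C} K^{\ast} y.
\]

Here \delta scales the noise precision and \lambda the prior precision, matching the two scaling hyper-parameters the abstract refers to; in the hierarchical setting of the second part, such scalings are themselves assigned priors and sampled jointly with u.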
