Bayesian models have become very popular in recent years across several fields, including statistics, signal processing, and machine learning. Bayesian inference requires the approximation of complicated integrals involving the posterior distribution. For this purpose, Monte Carlo (MC) methods, such as Markov Chain Monte Carlo (MCMC) and Importance Sampling (IS) algorithms, are often employed. In this work, we introduce a compressed MC (C-MC) scheme in order to compress the information previously obtained by MC sampling. The basic C-MC version is based on the stratification technique, well known for variance-reduction purposes. Deterministic C-MC schemes are also presented, which provide very good performance. The compression problem is closely related to the moment-matching approach applied in different filtering methods, often known as Gaussian quadrature rules or sigma-point methods. Connections to herding algorithms and the quasi-Monte Carlo perspective are also discussed. C-MC is particularly useful in a distributed Bayesian inference framework, when cheap and fast communications with a central processor are required. Numerical results confirm the benefit of the introduced schemes, which outperform the corresponding benchmark methods.
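To make the compression idea concrete, the following is a minimal illustrative sketch (not the paper's exact C-MC algorithm) of compressing N weighted Monte Carlo samples into M summary particles via stratification: the samples are sorted, split into strata of roughly equal total weight, and each stratum is summarized by its weighted mean with the stratum's total weight attached. The function name `compress_mc` and all details below are assumptions for illustration; by construction this toy scheme preserves the total weight and the overall weighted mean (a simple moment-matching property).

```python
import numpy as np

def compress_mc(samples, weights, M):
    """Compress N weighted 1-D samples into at most M (point, weight) pairs.

    Illustrative stratified compression: sort samples, cut into M strata of
    roughly equal cumulative weight, and represent each stratum by its
    weighted mean carrying the stratum's total mass.
    """
    order = np.argsort(samples)               # stratify along the sample axis
    s, w = samples[order], weights[order]
    cum = np.cumsum(w)
    # M-1 internal cut points at equal increments of cumulative weight
    edges = np.searchsorted(cum, np.linspace(0, cum[-1], M + 1)[1:-1])
    points, new_w = [], []
    for idx in np.split(np.arange(len(s)), edges):
        if len(idx) == 0:
            continue
        wi = w[idx]
        points.append(np.average(s[idx], weights=wi))  # stratum representative
        new_w.append(wi.sum())                         # stratum total mass
    return np.array(points), np.array(new_w)

# Example: compress 10,000 importance samples down to 20 summary points.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
w = np.exp(-0.5 * (x - 1.0) ** 2)   # unnormalized importance weights
xc, wc = compress_mc(x, w, M=20)

# Total weight and the overall weighted mean are preserved by construction.
print(np.isclose(wc.sum(), w.sum()))                            # True
print(np.isclose(np.average(xc, weights=wc),
                 np.average(x, weights=w)))                     # True
```

Higher-order moments are only approximately preserved by this toy scheme; the deterministic C-MC variants discussed in the paper are designed to improve on such a baseline.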
Comments: 16 Pages.
[v1] 2018-11-29 14:45:29