Effective Sample Size for Importance Sampling Based on Discrepancy Measures

Authors: L. Martino, V. Elvira, F. Louzada

The Effective Sample Size (ESS) is an important measure of the efficiency of Monte Carlo methods, such as Markov Chain Monte Carlo (MCMC) and Importance Sampling (IS) techniques. In the IS context, an approximation $\widehat{ESS}$ of the theoretical ESS definition is widely applied, involving the inverse of the sum of the squares of the normalized importance weights. This formula has become an essential component of Sequential Monte Carlo (SMC) methods, where it is used to decide whether a resampling step is advisable. From another perspective, the expression $\widehat{ESS}$ is related to the Euclidean distance between the probability mass function (pmf) described by the normalized weights and the discrete uniform pmf. In this work, we derive other possible ESS functions based on different discrepancy measures between these two pmfs. Several examples are provided involving, for instance, the geometric mean of the weights, the discrete entropy (including the {\it perplexity} measure, already proposed in the literature), and the Gini coefficient, among others. We list five theoretical requirements that a generic ESS function should satisfy, allowing us to classify different ESS measures. We also compare the most promising ones by means of numerical simulations.
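The two ESS measures named explicitly in the abstract can be sketched in a few lines. This is a minimal illustration (not the authors' code), assuming weights are given as a nonnegative array: the classical $\widehat{ESS}$ is the inverse of the sum of the squared normalized weights, and the perplexity-based ESS is the exponential of the discrete entropy of the normalized weights.

```python
import numpy as np

def ess_inverse_squares(weights):
    """Classical ESS approximation: 1 / sum of squared normalized weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights form a pmf
    return 1.0 / np.sum(w ** 2)

def ess_perplexity(weights):
    """Perplexity-based ESS: exp of the discrete entropy of the normalized weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    w = w[w > 0]  # drop zero weights to avoid log(0)
    return np.exp(-np.sum(w * np.log(w)))

# Uniform weights give ESS close to N (best case); a single dominant
# weight drives both measures toward 1 (worst case).
uniform = np.ones(100)
print(ess_inverse_squares(uniform))  # ~100 (up to floating-point rounding)
print(ess_perplexity(uniform))      # ~100 (up to floating-point rounding)
```

Both functions attain their maximum $N$ when the pmf of the normalized weights coincides with the discrete uniform pmf, which is the discrepancy-based viewpoint taken in the paper.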

Comments: Signal Processing, Volume 131, Pages 386-401, 2017


Submission history

[v1] 2016-02-09 14:48:10
[v2] 2016-02-10 07:48:50
[v3] 2016-02-14 08:13:03
[v4] 2016-02-19 04:23:27
[v5] 2016-02-20 06:30:34
[v6] 2016-03-05 09:11:03
[v7] 2016-09-23 03:15:35

