Statistics

1409 Submissions

[4] viXra:1409.0127 [pdf] replaced on 2015-03-17 07:17:05

Rates of Convergence of Lognormal Extremes Under Power Normalization

Authors: Jianwen Huang, Shouquan Chen
Comments: 10 Pages.

Let $\{X_n,n\geq1\}$ be a sequence of independent and identically distributed random variables with common lognormal distribution $F$. In this paper, we obtain the exact uniform convergence rate of the distribution of maxima to its extreme value limit under power normalization.
Category: Statistics
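For context on the power-normalization setting (this is the standard formulation going back to Pancheva, not notation taken from the paper itself): with $M_n=\max\{X_1,\dots,X_n\}$ and power norming constants $a_n>0$, $b_n>0$, one studies limits of the form

\[
P\left(\left(\frac{|M_n|}{a_n}\right)^{1/b_n}\operatorname{sign}(M_n)\le x\right)\longrightarrow H(x),
\]

where $H$ is a p-max stable law; the paper quantifies how fast this convergence holds, uniformly in $x$, when $F$ is lognormal.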

[3] viXra:1409.0119 [pdf] submitted on 2014-09-15 10:24:34

Tail Behavior of the Generalized Exponential and Maxwell Distributions

Authors: Jianwen Huang, Shouquan Chen
Comments: 9 Pages.

Motivated by Finner et al. (2008), the asymptotic behavior of the probability density function (pdf) and the cumulative distribution function (cdf) of the generalized exponential and Maxwell distributions is studied. Specifically, we consider the asymptotic behavior of the ratio of the pdfs (cdfs) of the generalized exponential and Student's $t$-distributions (likewise for the Maxwell and Student's $t$-distributions) as the degrees of freedom parameter approaches infinity in an appropriate way. As by-products, Mills' ratios for the generalized exponential and Maxwell distributions are obtained. Moreover, we present several examples to illustrate the application of our results in extreme value theory.
Category: Statistics
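The Mills' ratio mentioned in the abstract is $m(x)=(1-F(x))/f(x)$. A minimal numerical sketch for the generalized exponential distribution in the Gupta-Kundu parametrization, $F(x)=(1-e^{-\lambda x})^{\alpha}$ (the paper's asymptotic expansions are not reproduced here; this only illustrates the definition):

```python
import math

def gen_exp_cdf(x, alpha, lam):
    """CDF of the generalized exponential distribution: (1 - exp(-lam*x))**alpha."""
    return (1.0 - math.exp(-lam * x)) ** alpha

def gen_exp_pdf(x, alpha, lam):
    """pdf: alpha*lam*exp(-lam*x)*(1 - exp(-lam*x))**(alpha - 1)."""
    return alpha * lam * math.exp(-lam * x) * (1.0 - math.exp(-lam * x)) ** (alpha - 1)

def mills_ratio(x, alpha, lam):
    """Mills' ratio m(x) = (1 - F(x)) / f(x)."""
    return (1.0 - gen_exp_cdf(x, alpha, lam)) / gen_exp_pdf(x, alpha, lam)

# Sanity check: for alpha = 1 the distribution is exponential(lam),
# whose Mills' ratio is constant and equal to 1/lam.
print(mills_ratio(2.0, alpha=1.0, lam=0.5))  # ≈ 2.0
```

For $\alpha\neq 1$ the ratio is no longer constant, which is where the tail comparisons with Student's $t$ in the paper become non-trivial.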

[2] viXra:1409.0051 [pdf] replaced on 2016-05-27 09:29:29

On Multiple Try Schemes and the Particle Metropolis-Hastings Algorithm

Authors: L. Martino, F. Leisen, J. Corander
Comments: 21 Pages.

Markov Chain Monte Carlo (MCMC) algorithms and Sequential Monte Carlo (SMC) methods (a.k.a., particle filters) are well-known Monte Carlo methodologies, widely used in different fields for Bayesian inference and stochastic optimization. The Multiple Try Metropolis (MTM) algorithm is an extension of the standard Metropolis-Hastings (MH) algorithm in which the next state of the chain is chosen from a set of candidates according to certain weights. The Particle MH (PMH) algorithm is another advanced MCMC technique, specifically designed for scenarios where the multidimensional target density can be easily factorized as a product of conditional densities. PMH jointly combines SMC and MCMC approaches. Both MTM and PMH have been widely studied and applied in the literature. PMH variants have often been applied for the joint purpose of tracking dynamic variables and tuning constant parameters in a state-space model. Furthermore, PMH can also be considered as an alternative particle smoothing method. In this work, we investigate connections, similarities and differences among MTM schemes and PMH methods. This study allows the design of novel efficient schemes for filtering and smoothing purposes in state-space models. More specifically, one of them, called Particle Multiple Try Metropolis (P-MTM), obtains very promising results in different numerical simulations.
Category: Statistics
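The candidate-selection step described in the abstract can be sketched as follows. This is a textbook MTM step with the simplest weight choice (symmetric Gaussian proposal and $\lambda\equiv 1$, so the weights reduce to the target density), not the P-MTM scheme proposed in the paper; all function names are illustrative.

```python
import numpy as np

def mtm_step(x, log_target, k=5, scale=1.0, rng=None):
    """One Multiple Try Metropolis step with a symmetric Gaussian proposal.

    With a symmetric proposal and lambda = 1, each candidate's weight
    reduces to the target density evaluated at that candidate."""
    rng = np.random.default_rng() if rng is None else rng
    # Draw k candidates around the current state.
    ys = x + scale * rng.standard_normal(k)
    wy = np.exp(log_target(ys))
    # Select one candidate with probability proportional to its weight.
    y = rng.choice(ys, p=wy / wy.sum())
    # Reference points: k-1 draws around the selected candidate, plus x itself.
    xs = np.append(y + scale * rng.standard_normal(k - 1), x)
    wx = np.exp(log_target(xs))
    # Generalized Metropolis-Hastings acceptance ratio.
    if rng.random() < min(1.0, wy.sum() / wx.sum()):
        return y
    return x

# Usage: sample a standard normal target, starting far from the mode.
log_target = lambda v: -0.5 * v ** 2
rng = np.random.default_rng(0)
x, chain = 3.0, []
for _ in range(5000):
    x = mtm_step(x, log_target, k=5, scale=1.0, rng=rng)
    chain.append(x)
samples = np.array(chain[1000:])  # discard burn-in
print(samples.mean(), samples.std())  # roughly 0 and 1
```

Choosing among several candidates per iteration improves exploration relative to plain MH at the cost of extra target evaluations, which is the trade-off the paper analyzes in relation to PMH.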

[1] viXra:1409.0015 [pdf] replaced on 2014-12-15 15:30:35

Sequential Monte Carlo Methods for Filtering of Unobservable Components of Multidimensional Diffusion Markov Processes

Authors: Ellida M. Khazen
Comments: Pages. The paper is being published in Cogent Mathematics (2016), 2:1134031. http://dx.doi.org/10.1080/23311835.2015.1134031

The problem of filtering the unobservable components x(t) of a multidimensional continuous diffusion Markov process z(t)=(x(t),y(t)), given observations of the (multidimensional) process y(t) taken at discrete consecutive times with small time steps, is analytically investigated. On the basis of this investigation, new algorithms for simulating the unobservable components x(t), and new nonlinear filtering algorithms using sequential Monte Carlo methods, or particle filters, are developed. The analytical investigation of observed quadratic variations is also developed. New closed-form analytical formulae are obtained, which characterize the dispersions of deviations of the observed quadratic variations and the accuracy of some estimates of x(t). As an illustrative example, estimation of volatility (a problem in financial mathematics) is considered. The new algorithms extend the range of applications of sequential Monte Carlo methods, or particle filters, beyond hidden Markov models and improve their performance.
Category: Statistics
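The paper's algorithms are not reproduced here, but the baseline they extend, a bootstrap particle filter applied to a discretized scalar diffusion observed in noise, can be sketched as follows. The Ornstein-Uhlenbeck dynamics, noise levels, and function names below are illustrative assumptions, not the model from the paper.

```python
import numpy as np

def bootstrap_filter(y_obs, n_part=500, dt=0.1, sigma_x=0.5, sigma_y=0.3, rng=None):
    """Bootstrap particle filter for an Euler-Maruyama discretized OU state
    x_{t+1} = x_t - x_t*dt + sigma_x*sqrt(dt)*eps_t, observed as y_t = x_t + noise."""
    rng = np.random.default_rng() if rng is None else rng
    parts = rng.standard_normal(n_part)  # initial particle cloud
    means = []
    for y in y_obs:
        # Propagate: one Euler-Maruyama step of the state SDE.
        parts = parts - parts * dt + sigma_x * np.sqrt(dt) * rng.standard_normal(n_part)
        # Weight: Gaussian observation likelihood (log-scale for stability).
        logw = -0.5 * ((y - parts) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * parts))  # filtered posterior mean
        # Resample (multinomial) to avoid weight degeneracy.
        parts = rng.choice(parts, size=n_part, p=w)
    return np.array(means)

# Usage: simulate a state path, observe it in noise, and filter.
rng = np.random.default_rng(1)
T, dt = 50, 0.1
x = np.zeros(T)
for t in range(1, T):
    x[t] = x[t - 1] - x[t - 1] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()
y = x + 0.3 * rng.standard_normal(T)
est = bootstrap_filter(y, rng=rng)
print(np.mean(np.abs(est - x)))  # mean absolute tracking error
```

The point of contact with the abstract is that standard particle filters assume a hidden Markov model structure like the one above; the paper's analytical treatment of quadratic variations is what allows it to go beyond that setting.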