Abstract
In this paper, we consider a class of statistical models with a real-valued threshold parameter, defined as either the minimum or the maximum of the support of the sampling distribution. We prove large deviation principles for sequences of estimators (maximum likelihood estimators and posterior distributions) as the sample size goes to infinity. Furthermore, we illustrate some connections with the analogous large deviation results for natural exponential families.
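The setting can be illustrated with the classical Uniform(0, θ) example, where the threshold parameter θ is the maximum of the support and the maximum likelihood estimator is the sample maximum. A minimal sketch (the Uniform example, function name, and parameter values below are illustrative choices, not taken from the paper):

```python
import random

def mle_threshold_max(sample):
    """MLE of the upper endpoint theta for Uniform(0, theta) data:
    the likelihood theta^(-n) * 1{max(sample) <= theta} is maximized
    at the sample maximum (illustrative example)."""
    return max(sample)

random.seed(0)
theta = 2.0  # true (unknown) threshold, chosen for illustration
sample = [random.uniform(0, theta) for _ in range(10_000)]
theta_hat = mle_threshold_max(sample)
# theta_hat always underestimates theta, and the gap theta - theta_hat
# shrinks at rate 1/n; large deviation principles quantify the
# exponentially small probability of a gap of fixed size.
```

Here `theta_hat` never exceeds the true threshold, and the estimation error decays quickly with the sample size, which is the kind of convergence the paper's large deviation results make precise.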
Macci, C. Large deviations for estimators of some threshold parameters. Stat Methods Appl 19, 63–77 (2010). https://doi.org/10.1007/s10260-009-0119-y