Divergence measures and their use in statistical inference
Vlad Ştefan Barbu
LMRS, University of Rouen-Normandy, France & Centre for Demographic Research Vladimir Trebici, INCE, Romanian Academy, Romania
Abstract:
This presentation is concerned with statistical methodology based on divergence measures.
Divergence measures are of great importance in statistical inference; equally important are their limiting versions, known as divergence rates. In the first part of our presentation, we focus on generalized divergence measures for Markov chains. We consider generalizations of the Alpha divergence measure (Amari and Nagaoka, 2000) and of the Beta divergence measures (Basu et al., 1998) and investigate their limiting behaviour. We also study the corresponding weighted generalized divergence measures and the associated rates (Belis and Guiasu, 1968; Guiasu, 1971; Kapur, 1994).
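For orientation, the classical (unweighted) forms of the divergences cited above, for two discrete distributions p = (p_i) and q = (q_i), are usually written as follows; the notation here is ours, following the cited references, and the generalized versions discussed in the talk extend these forms:

\[
D_\alpha(p \,\|\, q) = \frac{4}{1-\alpha^2}\Big( 1 - \sum_i p_i^{(1-\alpha)/2}\, q_i^{(1+\alpha)/2} \Big), \qquad \alpha \neq \pm 1,
\]

\[
D_\beta(p \,\|\, q) = \sum_i \Big( q_i^{1+\beta} - \tfrac{1+\beta}{\beta}\, q_i^{\beta} p_i + \tfrac{1}{\beta}\, p_i^{1+\beta} \Big), \qquad \beta > 0.
\]

A weighted divergence multiplies each summand by a non-negative weight w_i; for instance, the weighted Kullback-Leibler divergence of Belis and Guiasu (1968) reads \( D_w(p \,\|\, q) = \sum_i w_i\, p_i \log(p_i / q_i) \).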
In the second part of our presentation, we focus on hypothesis testing based on weighted divergences. More precisely, we present a goodness-of-fit test and a homogeneity test and study their performance. Tests of this type, based on weighted divergences, allow us to focus on specific subsets of the support without losing the information carried by the rest of it. With this method we obtain tests that are significantly more sensitive than the classical ones, while keeping comparable error rates.
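The idea behind the weighted goodness-of-fit test can be sketched as follows. This is only an illustrative example in our own notation, built on the weighted Kullback-Leibler divergence; the function names, the choice of weights, and the exact statistics studied in the talk are assumptions for the sake of illustration.

```python
import numpy as np

def weighted_kl(p, q, w):
    """Weighted Kullback-Leibler divergence sum_i w_i * p_i * log(p_i / q_i).

    p, q : probability vectors on a common finite support
    w    : non-negative weights emphasising chosen parts of the support
    (hypothetical helper for illustration; not the talk's exact statistic)
    """
    p, q, w = (np.asarray(x, dtype=float) for x in (p, q, w))
    mask = p > 0  # convention: 0 * log 0 = 0
    return float(np.sum(w[mask] * p[mask] * np.log(p[mask] / q[mask])))

def weighted_gof_statistic(counts, q, w):
    """Goodness-of-fit statistic 2n * D_w(p_hat || q) from observed counts."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_hat = counts / n  # empirical distribution
    return 2.0 * n * weighted_kl(p_hat, q, w)

# With uniform weights this reduces to the classical likelihood-ratio
# (G) statistic; up-weighting selected cells makes the test more
# sensitive to departures from q on that subset of the support.
counts = [30, 25, 25, 20]
q = [0.25, 0.25, 0.25, 0.25]
print(weighted_gof_statistic(counts, q, [1, 1, 1, 1]))  # classical case
print(weighted_gof_statistic(counts, q, [3, 1, 1, 1]))  # emphasise first cell
```

The design choice to illustrate: the weights enter only through the summands, so the classical test is recovered exactly when all weights equal one, which is how such weighted tests remain directly comparable to their unweighted counterparts.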