Institute of Mathematical Statistics and Actuarial Science

Swiss Statistics Seminar, Spring 2017

An Entropy-Based Test for Multivariate Threshold Exceedances

Sebastian Engelke (École Polytechnique Fédérale de Lausanne)

Many effects of climate change seem to be reflected not in the mean of temperature, precipitation or other environmental variables, but rather in the frequency and severity of extreme events in the distributional tails. Detecting such changes requires a statistical methodology that efficiently uses the largest observations in the sample. We propose a simple, non-parametric test that decides whether two multivariate distributions exhibit the same tail behavior. The test is based on the relative entropy, namely the Kullback-Leibler divergence, between exceedances over a high threshold of the two multivariate random vectors. We show that this divergence is closely related to the divergence between Bernoulli random variables. We study the properties of the test and explore its effectiveness for finite sample sizes. As an application, we apply the method to precipitation data and test whether the marginal tails and/or the extremal dependence structure have changed over time.
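
To give a rough feel for the kind of comparison described in the abstract, the sketch below contrasts the tails of two multivariate samples via the Kullback-Leibler divergence between Bernoulli exceedance indicators. It is a minimal illustration, not the test presented in the talk: the componentwise exceedance rule, the pooled-quantile threshold, the permutation calibration and all function names are assumptions made here for demonstration only.

```python
import numpy as np

def bernoulli_kl(p, q, eps=1e-12):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def exceedance_test(x, y, quantile=0.95, n_perm=999, rng=None):
    """Permutation test comparing tail behavior of two multivariate samples.

    An observation counts as an exceedance if any component exceeds the
    pooled componentwise quantile (a crude proxy for lying in the tail).
    The statistic is the KL divergence between the two empirical
    Bernoulli exceedance distributions.
    """
    rng = np.random.default_rng(rng)
    pooled = np.vstack([x, y])
    thresh = np.quantile(pooled, quantile, axis=0)   # componentwise threshold
    exceed = (pooled > thresh).any(axis=1)           # Bernoulli indicator per row

    labels = np.r_[np.ones(len(x), bool), np.zeros(len(y), bool)]

    def stat(lab):
        return bernoulli_kl(exceed[lab].mean(), exceed[~lab].mean())

    observed = stat(labels)
    perm_stats = np.array([stat(rng.permutation(labels)) for _ in range(n_perm)])
    p_value = (1 + np.sum(perm_stats >= observed)) / (1 + n_perm)
    return observed, p_value

if __name__ == "__main__":
    # Simulated example: y has heavier tails than x.
    rng = np.random.default_rng(42)
    x = rng.standard_normal((500, 3))
    y = rng.standard_t(df=2, size=(500, 3))
    d, p = exceedance_test(x, y, quantile=0.95, n_perm=499, rng=rng)
    print(f"KL statistic: {d:.4f}, permutation p-value: {p:.3f}")
```

This toy version only compares exceedance frequencies; the test discussed in the seminar additionally addresses the extremal dependence structure of the multivariate exceedances, which the sketch does not attempt to capture.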