Two approaches exist for practical extreme value analysis.
The first method relies on deriving block maxima (minima) series as a preliminary step. In many situations it is customary and convenient to extract the annual maxima (minima), generating an "Annual Maxima Series" (AMS).
The second method relies on extracting, from a continuous record, the peak values reached during any period in which values exceed (or fall below) a certain threshold. This method is generally referred to as the "peak over threshold" (POT) method.
For AMS data, the analysis may partly rely on the results of the Fisher-Tippett-Gnedenko theorem, leading to the generalized extreme value distribution being selected for fitting. However, in practice, various procedures are applied to select between a wider range of distributions. The theorem here relates to the limiting distributions for the minimum or the maximum of a very large collection of independent random variables from the same distribution. Given that the number of relevant random events within a year may be rather limited, it is unsurprising that analyses of observed AMS data often lead to distributions other than the generalized extreme value distribution (GEVD) being selected.
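This block-maxima workflow can be sketched with SciPy's `genextreme` distribution. The data below are simulated rather than observed, and note one assumption worth flagging: SciPy parameterizes the GEV with a shape `c` equal to minus the shape parameter used in most extreme value texts, so `c` near 0 is the Gumbel case.

```python
# Sketch: fitting a GEV distribution to a simulated annual-maxima series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulate 50 "years" of 365 daily values each, then take the annual maxima.
daily = rng.gumbel(loc=20.0, scale=5.0, size=(50, 365))
annual_maxima = daily.max(axis=1)

# scipy.stats.genextreme uses shape c = -xi relative to the usual GEV
# convention; c near 0 corresponds to the Gumbel (Type 1) case.
c, loc, scale = stats.genextreme.fit(annual_maxima)
print(f"shape c = {c:.3f}, loc = {loc:.2f}, scale = {scale:.2f}")

# 100-year return level: the level exceeded with probability 1/100 per year.
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"100-year return level: {return_level_100:.1f}")
```

In applied work the fitted quantiles (return levels) are usually the quantity of interest, as in the last line above.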
For POT data, the analysis may involve fitting two distributions: one for the number of events in a time period considered and a second for the size of the exceedances.
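A minimal sketch of the two fits described above, using common (but not prescribed) choices: a Poisson model for the number of exceedances per period and a generalized Pareto distribution for the exceedance sizes. The record, the period length, and the 95th-percentile threshold are illustrative assumptions.

```python
# Sketch of a peaks-over-threshold (POT) analysis on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
values = rng.exponential(scale=2.0, size=10_000)  # continuous record
threshold = np.quantile(values, 0.95)             # illustrative threshold choice

# First distribution: number of exceedance events per period (Poisson rate),
# here taking the record as 100 periods of 100 observations each.
n_periods = 100
events_per_period = (values.reshape(n_periods, -1) > threshold).sum(axis=1)
poisson_rate = events_per_period.mean()

# Second distribution: sizes of the exceedances over the threshold,
# fitted with a generalized Pareto distribution (location fixed at 0).
exceedances = values[values > threshold] - threshold
xi, _, beta = stats.genpareto.fit(exceedances, floc=0.0)
print(f"rate = {poisson_rate:.1f} events/period, "
      f"GPD shape xi = {xi:.3f}, scale beta = {beta:.2f}")
```

For exponentially distributed data the fitted GPD shape should come out near zero, reflecting the exponential tail.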
The field of extreme value theory was pioneered by Leonard Tippett (1902-1985). Tippett was employed by the British Cotton Industry Research Association, where he worked to make cotton thread stronger. In his studies, he realized that the strength of a thread was controlled by the strength of its weakest fibres. With the help of R. A. Fisher, Tippett obtained three asymptotic limits describing the distributions of extremes assuming independent variables. Emil Julius Gumbel codified this theory in his 1958 book Statistics of Extremes, including the Gumbel distributions that bear his name. These results can be extended to allow for slight correlations between variables, but the classical theory does not extend to strong correlations of the order of the variance. One universality class of particular interest is that of log-correlated fields, where the correlations decay logarithmically with the distance.
In practice, we might not have the distribution function, but the Fisher-Tippett-Gnedenko theorem provides an asymptotic result. If there exist sequences of constants a_n > 0 and b_n such that

Pr{(M_n - b_n)/a_n <= z} -> F(z) as n -> infinity,

then

F(z) is proportional to exp(-(1 + zeta z)^(-1/zeta)),

where the shape parameter zeta depends on the tail shape of the distribution.
When normalized, G belongs to one of the following non-degenerate distribution families:

Weibull law: G(z) = exp{-(-((z - b)/a))^alpha} for z < b, and G(z) = 1 for z >= b, when the distribution of M_n has a light tail with finite upper bound. Also known as Type 3.

Gumbel law: G(z) = exp{-exp(-((z - b)/a))} for all real z, when the distribution of M_n has an exponential tail. Also known as Type 1.

Fréchet law: G(z) = exp{-((z - b)/a)^(-alpha)} for z > b, and G(z) = 0 for z <= b, when the distribution of M_n has a heavy tail (including polynomial decay). Also known as Type 2.

In all cases, alpha > 0.
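The Gumbel (Type 1) case can be checked numerically. For Exponential(1) variables, the maximum of n observations shifted by log(n) converges in distribution to the standard Gumbel law; the block size and sample counts below are illustrative choices.

```python
# Numerical check of the Type 1 (Gumbel) domain of attraction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, n_blocks = 1000, 5000

# Maxima of n Exponential(1) variables, centred by log(n).
maxima = rng.exponential(size=(n_blocks, n)).max(axis=1) - np.log(n)

# The Kolmogorov-Smirnov distance to the standard Gumbel CDF should be small.
ks = stats.kstest(maxima, stats.gumbel_r.cdf).statistic
print(f"KS distance to standard Gumbel: {ks:.4f}")
```

The same experiment with bounded or heavy-tailed samples would drift toward the Weibull and Fréchet limits instead.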
Extreme value theory in more than one variable introduces additional issues that have to be addressed. One problem that arises is that one must specify what constitutes an extreme event. Although this is straightforward in the univariate case, there is no unambiguous way to do this in the multivariate case. The fundamental problem is that although it is possible to order a set of real-valued numbers, there is no natural way to order a set of vectors.
As an example, in the univariate case, given a set of observations x_1, ..., x_n, it is straightforward to find the most extreme event simply by taking the maximum (or minimum) of the observations. However, in the bivariate case, given a set of observations (x_1, y_1), ..., (x_n, y_n), it is not immediately clear how to find the most extreme event. Suppose that one has measured the values (x_i, y_i) at a specific time and the values (x_j, y_j) at a later time. Which of these events would be considered more extreme? There is no universal answer to this question.
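A toy illustration of the ordering problem, with made-up numbers: the componentwise maximum of a set of bivariate observations need not coincide with any single observed event, so taking "the maximum" is no longer well defined.

```python
# With vectors, the componentwise maximum may not be an observed event.
import numpy as np

obs = np.array([[3.0, 1.0],
                [1.0, 3.0],
                [2.0, 2.0]])

componentwise_max = obs.max(axis=0)  # maximum in each coordinate separately
is_observed = any(np.array_equal(componentwise_max, row) for row in obs)
print(componentwise_max, is_observed)  # [3. 3.] False
```

Here neither (3, 1) nor (1, 3) dominates the other, which is exactly the ambiguity described above.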
Another issue in the multivariate case is that the limiting model is not as fully prescribed as in the univariate case. In the univariate case, the model (GEV distribution) contains three parameters whose values are not predicted by the theory and must be obtained by fitting the distribution to the data. In the multivariate case, the model not only contains unknown parameters, but also a function whose exact form is not prescribed by the theory. However, this function must obey certain constraints.
As an example of an application, bivariate extreme value theory has been applied to ocean research.