In probability theory, to obtain a nondegenerate limiting distribution for the extreme values of a sample, it is necessary to "reduce" the actual greatest value by applying a linear transformation with coefficients that depend on the sample size.
If $X_1, X_2, \dots, X_n$ are independent random variables with common probability density function $f_X(x)$ and cumulative distribution function $F_X(x)$,
then the cumulative distribution function $F_{Y_n}$ of $Y_n \equiv \max\{X_1, \ldots, X_n\}$ is given by the simple relation
$$F_{Y_n}(y) = \left[\, F_X(y) \,\right]^n ,$$

since $Y_n \le y$ exactly when every $X_i \le y$, and the $X_i$ are independent. If a limiting distribution exists, the stability postulate states that it must be the limiting distribution of some sequence of transformed or "reduced" values, such as $a_n Y_n + b_n$, where $a_n$ and $b_n$ may depend on $n$ but not on $y$. This equation was obtained by Maurice René Fréchet and also by Ronald Fisher.
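As a quick numerical illustration of this relation, the sketch below compares the empirical distribution of the sample maximum with $[F_X(y)]^n$ at a single point; it assumes NumPy, and the choice of standard exponential samples and of the evaluation point is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 50, 200_000

# Draw `trials` samples of size n from an illustrative distribution
# (standard exponential) and record each sample's maximum.
x = rng.exponential(size=(trials, n))
maxima = x.max(axis=1)

y = 3.0                              # evaluation point (arbitrary)
F_X = 1.0 - np.exp(-y)               # exponential CDF at y
lhs = np.mean(maxima <= y)           # empirical P(Y_n <= y)
rhs = F_X ** n                       # [F_X(y)]^n

print(f"empirical F_Y({y}) = {lhs:.4f}, [F_X({y})]^{n} = {rhs:.4f}")
```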
Only three possible distributions

To distinguish the limiting cumulative distribution function of the "reduced" greatest value from $F(x)$, we will denote it by $G(y)$. It follows that $G(y)$ must satisfy the functional equation
$$\left[\, G(y) \,\right]^n = G(a_n y + b_n) .$$

Boris Vladimirovich Gnedenko has shown that there are no distributions satisfying the stability postulate other than the following three:[1]
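As a concrete instance, the standard Gumbel CDF $G(y) = \exp(-e^{-y})$ satisfies this functional equation with $a_n = 1$ and $b_n = -\log n$, since $\left[\exp(-e^{-y})\right]^n = \exp(-n e^{-y}) = \exp\!\left(-e^{-(y - \log n)}\right)$. The short check below is a minimal sketch of this, assuming NumPy.

```python
import numpy as np

def G(y):
    """Standard Gumbel CDF: G(y) = exp(-exp(-y))."""
    return np.exp(-np.exp(-y))

n = 10
y = np.linspace(-3.0, 6.0, 25)

lhs = G(y) ** n               # [G(y)]^n
rhs = G(y - np.log(n))        # G(a_n*y + b_n) with a_n = 1, b_n = -log n

print(np.allclose(lhs, rhs))  # True: the Gumbel CDF satisfies the postulate
```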
Gumbel distribution for the minimum stability postulate

If $X_i \sim \textrm{Gumbel}(\mu, \beta)$ and $Y \equiv \min\{X_1, \ldots, X_n\}$, then $a_n Y + b_n \sim X_i$, where $a_n = 1$ and $b_n = \beta \log n$. In other words, $Y \sim \textrm{Gumbel}(\mu - \beta \log n,\ \beta)$.
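A minimal simulation sketch of this case follows, assuming NumPy; the minimum-type Gumbel is generated by negating NumPy's (maximum-type) `gumbel` sampler, and the parameter values are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, beta = 2.0, 1.5          # illustrative parameter values (assumptions)
n, trials = 20, 200_000

def gumbel_min(loc, scale, size):
    # Minimum-type Gumbel via negation of NumPy's maximum-type Gumbel.
    return -rng.gumbel(loc=-loc, scale=scale, size=size)

# Minimum of n iid Gumbel-min(mu, beta) samples, repeated `trials` times.
y = gumbel_min(mu, beta, size=(trials, n)).min(axis=1)

# Direct draws from the claimed form Gumbel-min(mu - beta*log n, beta).
z = gumbel_min(mu - beta * np.log(n), beta, size=trials)

# The two empirical distributions should agree (compare a few quantiles).
q = [0.1, 0.5, 0.9]
print(np.quantile(y, q).round(3))
print(np.quantile(z, q).round(3))
```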
Weibull distribution (extreme value) for the maximum stability postulate

If $X_i \sim \textrm{Weibull}(\mu, \sigma)$ and $Y \equiv \max\{X_1, \ldots, X_n\}$, then $a_n Y + b_n \sim X_i$, where $a_n = 1$ and $b_n = \sigma \log\!\left(\tfrac{1}{n}\right)$. In other words, $Y \sim \textrm{Weibull}\!\left(\mu - \sigma \log\!\left(\tfrac{1}{n}\right),\ \sigma\right)$.
Fréchet distribution for the maximum stability postulate

If $X_i \sim \textrm{Fréchet}(\alpha, s, m)$ and $Y \equiv \max\{X_1, \ldots, X_n\}$, then $a_n Y + b_n \sim X_i$, where $a_n = n^{-1/\alpha}$ and $b_n = m\left(1 - n^{-1/\alpha}\right)$. In other words, $Y \sim \textrm{Fréchet}\!\left(\alpha,\ n^{1/\alpha} s,\ m\right)$.

References

1. Gnedenko, B. (1943). "Sur la distribution limite du terme maximum d'une série aléatoire". Annals of Mathematics. 44 (3): 423–453. doi:10.2307/1968974.