In probability theory, Eaton's inequality is a bound on the largest values of a linear combination of bounded random variables. This inequality was described in 1974 by Morris L. Eaton.[1]

Statement of the inequality

Let {Xᵢ} be a set of real independent random variables, each with an expected value of zero and bounded in absolute value by 1 (|Xᵢ| ≤ 1, for 1 ≤ i ≤ n). The variates do not have to be identically or symmetrically distributed. Let {aᵢ} be a set of n fixed real numbers with

a₁² + a₂² + ⋯ + aₙ² = 1.
Eaton showed that

P( |a₁X₁ + a₂X₂ + ⋯ + aₙXₙ| ≥ k ) ≤ 2 inf_{0 ≤ c < k} ∫_c^∞ ((z − c)/(k − c))³ φ(z) dz = 2B_E(k),
where φ(x) is the probability density function of the standard normal distribution.
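
As a rough numerical illustration, the following Python sketch (a minimal example, assuming the reconstructed form of the bound written above; the weights, the Rademacher choice of bounded zero-mean variates, and the helper name eaton_bound are illustrative, not part of the original) estimates the left-hand side by Monte Carlo and evaluates 2B_E(k) by quadrature with a grid search over c.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def eaton_bound(k, grid=400):
    # B_E(k) = inf over 0 <= c < k of the integral from c to infinity of
    # ((z - c)/(k - c))^3 * phi(z) dz, approximated here by quadrature plus
    # a grid search over c.
    def inner(c):
        f = lambda z: ((z - c) / (k - c)) ** 3 * norm.pdf(z)
        return quad(f, c, np.inf)[0]
    return min(inner(c) for c in np.linspace(0.0, k, grid, endpoint=False))

rng = np.random.default_rng(0)
n, k, trials = 10, 2.0, 200_000
a = rng.normal(size=n)
a /= np.linalg.norm(a)                         # enforce a_1^2 + ... + a_n^2 = 1
x = rng.choice([-1.0, 1.0], size=(trials, n))  # Rademacher X_i: mean zero, |X_i| <= 1
tail = np.mean(np.abs(x @ a) >= k)             # Monte Carlo estimate of P(|sum a_i X_i| >= k)
print("empirical tail:", tail, "  Eaton bound 2*B_E(k):", 2 * eaton_bound(k))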

A related bound is Edelman's

P( |a₁X₁ + a₂X₂ + ⋯ + aₙXₙ| ≥ k ) ≤ 2 ( 1 − Φ( k − 1.5/k ) ) = 2B_Ed(k),
where Φ(x) is the cumulative distribution function of the standard normal distribution.

Pinelis has shown that Eaton's bound can be sharpened:[2]

B_EP(k) = min{ 1, k^(−2), 2B_E(k) }.
A set of critical values for Eaton's bound has been determined.[3]
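
To compare the three expressions side by side, the short sketch below (again a hedged illustration that assumes the reconstructed formulas above; it simply re-defines the same quadrature helper) tabulates 2B_E(k), Edelman's bound and the Pinelis minimum at a few values of k.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def eaton_bound(k, grid=400):
    # Same quadrature/grid-search approximation of B_E(k) as in the previous sketch.
    def inner(c):
        f = lambda z: ((z - c) / (k - c)) ** 3 * norm.pdf(z)
        return quad(f, c, np.inf)[0]
    return min(inner(c) for c in np.linspace(0.0, k, grid, endpoint=False))

for k in (2.0, 2.5, 3.0):
    two_be = 2 * eaton_bound(k)
    edelman = 2 * (1 - norm.cdf(k - 1.5 / k))   # Edelman's bound as reconstructed above
    pinelis = min(1.0, k ** -2, two_be)         # Pinelis's sharpened bound
    print(f"k={k}:  2*B_E={two_be:.5f}  Edelman={edelman:.5f}  Pinelis={pinelis:.5f}")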

Related inequalities

Let {aᵢ} be a set of independent Rademacher random variables, that is, P( aᵢ = 1 ) = P( aᵢ = −1 ) = 1/2. Let Z be a normally distributed variate with mean 0 and variance 1. Let {bᵢ} be a set of n fixed real numbers such that

b₁² + b₂² + ⋯ + bₙ² = 1.
This last condition is required by the Riesz–Fischer theorem, which states that

a₁b₁ + a₂b₂ + ⋯ + aₙbₙ

will converge if and only if

b₁² + b₂² + ⋯ + bₙ²

is finite.

Then

E f( a₁b₁ + a₂b₂ + ⋯ + aₙbₙ ) ≤ E f( Z )

for f(x) = |x|^p. The case p ≥ 3 was proved by Whittle[4] and the case p ≥ 2 by Haagerup.[5]
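
A quick Monte Carlo check of this moment comparison for p = 3 (a sketch only; the weights bᵢ are arbitrary example values, and the closed form E|Z|^p = 2^(p/2) Γ((p+1)/2)/√π is the standard expression for the absolute moments of a standard normal variate):

import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(1)
n, p, trials = 8, 3.0, 500_000
b = rng.normal(size=n)
b /= np.linalg.norm(b)                          # fixed weights with b_1^2 + ... + b_n^2 = 1
a = rng.choice([-1.0, 1.0], size=(trials, n))   # independent Rademacher signs
lhs = np.mean(np.abs(a @ b) ** p)               # estimate of E|a_1 b_1 + ... + a_n b_n|^p
rhs = 2 ** (p / 2) * gamma((p + 1) / 2) / np.sqrt(np.pi)   # E|Z|^p in closed form
print(f"E|S|^p ≈ {lhs:.4f}  <=  E|Z|^p = {rhs:.4f}")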


If f(x) = e^(λx) with λ ≥ 0 then

P( a₁b₁ + a₂b₂ + ⋯ + aₙbₙ ≥ x ) ≤ inf_{λ ≥ 0} E( e^(λZ) ) e^(−λx) = e^(−x²/2),

where inf is the infimum over λ; since E( e^(λZ) ) = e^(λ²/2), the infimum is attained at λ = x.[6]
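
The closed form can be verified directly; the small sketch below minimizes e^(λ²/2 − λx) numerically and compares the result with e^(−x²/2).

import numpy as np
from scipy.optimize import minimize_scalar

# Minimize E(e^{lambda Z}) e^{-lambda x} = e^{lambda^2/2 - lambda x} over lambda >= 0
# and compare with the closed form e^{-x^2/2} (the minimum is attained at lambda = x).
for x in (0.5, 1.0, 2.0):
    res = minimize_scalar(lambda lam: np.exp(lam ** 2 / 2 - lam * x),
                          bounds=(0.0, 10.0), method="bounded")
    print(f"x={x}:  numerical infimum = {res.fun:.6f}   e^(-x^2/2) = {np.exp(-x ** 2 / 2):.6f}")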


Let

Sₙ = a₁b₁ + a₂b₂ + ⋯ + aₙbₙ.

Then[7]

P( Sₙ ≥ x ) ≤ ( 2e³/9 ) P( Z ≥ x ).

The constant 2e³/9 in the last inequality is approximately 4.4634.
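
For small n the distribution of Sₙ can be enumerated exactly, which gives a simple check of this Gaussian comparison (a sketch assuming the reconstructed inequality above; the weights bᵢ are arbitrary example values):

import itertools
import numpy as np
from scipy.stats import norm

n = 12
rng = np.random.default_rng(2)
b = rng.normal(size=n)
b /= np.linalg.norm(b)                          # weights with b_1^2 + ... + b_n^2 = 1
signs = np.array(list(itertools.product([-1.0, 1.0], repeat=n)))   # all 2^n sign patterns
s = signs @ b                                   # every value of S_n, each with probability 2^-n
c = 2 * np.e ** 3 / 9                           # the constant 2e^3/9 ≈ 4.4634
for x in (1.0, 1.5, 2.0):
    lhs = np.mean(s >= x)                       # exact P(S_n >= x)
    rhs = c * norm.sf(x)                        # (2e^3/9) * P(Z >= x)
    print(f"x={x}:  P(S_n >= x) = {lhs:.5f}  <=  (2e^3/9) P(Z >= x) = {rhs:.5f}")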


An alternative bound is also known:[8]

P( Sₙ ≥ x ) ≤ e^(−x²/2).

This last bound is related to Hoeffding's inequality.
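
Neither of the two tail bounds dominates the other for all x; the short sketch below simply tabulates both (assuming the reconstructed expressions above) so they can be compared.

import numpy as np
from scipy.stats import norm

c = 2 * np.e ** 3 / 9                           # ≈ 4.4634
for x in (1.0, 1.5, 2.0, 3.0):
    gauss = c * norm.sf(x)                      # (2e^3/9) P(Z >= x)
    hoeff = np.exp(-x ** 2 / 2)                 # e^(-x^2/2), the Hoeffding-type bound
    print(f"x={x}:  (2e^3/9) P(Z >= x) = {gauss:.5f}   e^(-x^2/2) = {hoeff:.5f}")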


In the uniform case where all the bᵢ = n^(−1/2), the maximum value of Sₙ is n^(1/2). In this case van Zuijlen has shown that[9]

P( |Sₙ − μ| ≤ σ ) ≥ 0.5,

where μ is the mean and σ is the standard deviation of the sum.
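
For small n the bound can be checked exactly, since in this uniform case Sₙ = (a₁ + ⋯ + aₙ)/√n = (2K − n)/√n with K binomially distributed; the sketch below (assuming the reconstructed statement above, with μ = 0 and σ = 1) computes P(|Sₙ| ≤ 1) exactly for a few n.

import numpy as np
from scipy.stats import binom

# In the uniform case S_n = (2K - n)/sqrt(n) with K ~ Binomial(n, 1/2),
# so P(|S_n - mu| <= sigma) = P(|S_n| <= 1) can be computed exactly.
for n in (1, 2, 5, 10, 25, 100):
    k = np.arange(n + 1)
    s = (2 * k - n) / np.sqrt(n)
    p = binom.pmf(k, n, 0.5)
    print(f"n={n}:  P(|S_n| <= 1) = {p[np.abs(s) <= 1].sum():.4f}")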

References

  1. Eaton, Morris L. (1974) "A probability inequality for linear combinations of bounded random variables." Annals of Statistics 2(3) 609–614
  2. Pinelis, I. (1994) "Extremal probabilistic problems and Hotelling's T² test under a symmetry condition." Annals of Statistics 22(1), 357–368
  3. Dufour, J-M; Hallin, M (1993) "Improved Eaton bounds for linear combinations of bounded random variables, with statistical applications", Journal of the American Statistical Association, 88(243) 1026–1033
  4. Whittle P (1960) Bounds for the moments of linear and quadratic forms in independent variables. Teor Verojatnost i Primenen 5: 331–335 MR0133849
  5. Haagerup U (1982) The best constants in the Khinchine inequality. Studia Math 70: 231–283 MR0654838
  6. Hoeffding W (1963) Probability inequalities for sums of bounded random variables. J Amer Statist Assoc 58: 13–30 MR144363
  7. Pinelis I (1994) Optimum bounds for the distributions of martingales in Banach spaces. Ann Probab 22(4):1679–1706
  8. de la Pena, VH, Lai TL, Shao Q (2009) Self normalized processes. Springer-Verlag, New York
  9. van Zuijlen Martien CA (2011) On a conjecture concerning the sum of independent Rademacher random variables. https://arxiv.org/abs/1112.4988