Demo: Probability Inequalities and Limit Theorems

For Bernoulli samples, the demo compares the exact binomial tail probability with several upper bounds from lecture and one normal approximation.

Mathematical setup

Let $X_i\sim\mathrm{Bernoulli}(p)$ iid and $\bar X_n=n^{-1}\sum_i X_i$. The demo studies the two-sided deviation event

\[\lvert \bar X_n-p\rvert \geq \epsilon.\]

The exact probability is obtained by summing the pmf of $K=\sum_i X_i\sim\mathrm{Binomial}(n,p)$ over all counts $k$ with $\lvert k/n - p\rvert\geq\epsilon$. Chebyshev's inequality gives

\[\Pr(\lvert \bar X_n-p\rvert\geq \epsilon)\leq \frac{p(1-p)}{n\epsilon^2}.\]
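These two quantities might be computed as follows (a minimal sketch; the function names and the parameter values at the bottom are illustrative, not taken from the demo):

```python
from math import comb

def exact_two_sided_tail(n, p, eps):
    """P(|X̄_n - p| >= eps), summing the Binomial(n, p) pmf over tail counts."""
    return sum(
        comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(n + 1)
        # small tolerance so boundary counts like k/n = p - eps survive
        # floating-point rounding
        if abs(k / n - p) >= eps - 1e-12
    )

def chebyshev_bound(n, p, eps):
    """Chebyshev upper bound p(1 - p) / (n * eps^2), capped at 1."""
    return min(1.0, p * (1 - p) / (n * eps**2))

# Illustrative values: the bound must dominate the exact tail.
n, p, eps = 100, 0.3, 0.1
print(exact_two_sided_tail(n, p, eps), chebyshev_bound(n, p, eps))
```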

The Markov bar uses the two one-sided bounds, obtained by applying Markov's inequality to $\bar X_n$ and to $1-\bar X_n$ respectively:

\[\Pr(\bar X_n\geq p+\epsilon)\leq \frac{p}{p+\epsilon}, \qquad \Pr(\bar X_n\leq p-\epsilon)\leq \frac{1-p}{1-p+\epsilon},\]

summed as a union bound, with each term included only when the corresponding deviation is attainable ($p+\epsilon\leq 1$ for the upper tail, $p-\epsilon\geq 0$ for the lower). The Chernoff/KL form uses

\[\Pr(\bar X_n\geq p+\epsilon)\leq e^{-nD(p+\epsilon\,\Vert\,p)}, \qquad \Pr(\bar X_n\leq p-\epsilon)\leq e^{-nD(p-\epsilon\,\Vert\,p)},\]

where

\[D(q\,\Vert\,p)=q\log\frac{q}{p}+(1-q)\log\frac{1-q}{1-p}.\]

These KL terms are used when the shifted values $p+\epsilon$ and $p-\epsilon$ stay in $[0,1]$, and are then summed as a union bound for the two-sided event. Hoeffding gives the two-sided bound $2e^{-2n\epsilon^2}$. The CLT bar is a continuity-corrected normal approximation, not an upper bound.
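The three bound families can be sketched as below (a sketch under the conventions just stated; capping at 1 is cosmetic, since any probability bound above 1 is vacuous):

```python
from math import exp, log

def kl_bernoulli(q, p):
    """D(q || p) between Bernoulli(q) and Bernoulli(p), with 0·log 0 = 0."""
    d = 0.0
    if q > 0:
        d += q * log(q / p)
    if q < 1:
        d += (1 - q) * log((1 - q) / (1 - p))
    return d

def markov_two_sided(p, eps):
    """Sum of the one-sided Markov bounds, skipping unattainable deviations."""
    total = 0.0
    if p + eps <= 1:
        total += p / (p + eps)            # P(X̄ >= p + eps) <= p / (p + eps)
    if p - eps >= 0:
        total += (1 - p) / (1 - p + eps)  # P(X̄ <= p - eps) <= (1-p)/(1-p+eps)
    return min(1.0, total)

def chernoff_kl_two_sided(n, p, eps):
    """Union bound over exp(-n * D(p ± eps || p)) when p ± eps lies in [0, 1]."""
    total = 0.0
    if p + eps <= 1:
        total += exp(-n * kl_bernoulli(p + eps, p))
    if p - eps >= 0:
        total += exp(-n * kl_bernoulli(p - eps, p))
    return min(1.0, total)

def hoeffding_two_sided(n, eps):
    """Two-sided Hoeffding bound 2 * exp(-2 * n * eps^2)."""
    return min(1.0, 2 * exp(-2 * n * eps**2))
```

By Pinsker's inequality, $D(q\,\Vert\,p)\geq 2(q-p)^2$, so each KL term is at most the matching Hoeffding term and the Chernoff/KL bar never exceeds the Hoeffding bar.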

What to try

  • Start with a moderate deviation, then raise $\epsilon$. The exact tail should fall quickly, and the exponential bounds usually react faster than Chebyshev.
  • Move $p$ near 0.05 or 0.95. Notice where symmetric normal approximations become less reliable for small or moderate $n$.
  • Increase $n$ and watch the finite-sample upper bounds tighten. Markov remains intentionally crude because it uses only the mean, not the variance or the boundedness of the $X_i$.
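A quick sweep over $n$ makes the last point concrete (the values of $p$ and $\epsilon$ are illustrative). Note that for moderate $n$ the polynomial Chebyshev bound can actually beat Hoeffding; the exponential decay only wins once $n$ is large enough:

```python
from math import exp

p, eps = 0.3, 0.1  # illustrative settings
for n in (10, 100, 1000):
    chebyshev = min(1.0, p * (1 - p) / (n * eps**2))  # decays like 1/n
    hoeffding = min(1.0, 2 * exp(-2 * n * eps**2))    # decays exponentially
    print(f"n={n:4d}  Chebyshev={chebyshev:.4g}  Hoeffding={hoeffding:.4g}")
```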

Markov is shown as a two-sided union bound. Chebyshev, Chernoff, and Hoeffding are upper bounds; the CLT bar is a continuity-corrected approximation.
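The CLT bar might be computed as follows (a sketch; the half-unit continuity correction is applied on the count scale, and the small tolerances guard the floor/ceil against floating-point rounding):

```python
from math import erf, sqrt, floor, ceil

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def clt_two_sided(n, p, eps):
    """Continuity-corrected normal approximation to P(|X̄_n - p| >= eps).

    Approximates P(K <= n(p - eps)) + P(K >= n(p + eps)) for
    K ~ Binomial(n, p); this is an approximation, not an upper bound.
    """
    mu, sd = n * p, sqrt(n * p * (1 - p))
    lo = floor(n * (p - eps) + 1e-9)  # largest count in the lower tail
    hi = ceil(n * (p + eps) - 1e-9)   # smallest count in the upper tail
    lower = normal_cdf((lo + 0.5 - mu) / sd) if lo >= 0 else 0.0
    upper = 1.0 - normal_cdf((hi - 0.5 - mu) / sd) if hi <= n else 0.0
    return lower + upper
```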
