# Chernoff bound proof

The statement and proof of a typical Chernoff bound.

# A Chernoff bound, and proof

Lemma (Chernoff bound [1, 2]).

Let $Y = \sum_{t=1}^T y_t$ be a sum of independent random variables, where each $y_t$ is in $[0,1]$ and $T$ is fixed. Let $\varepsilon \in [0,1]$ and $\mu \ge 0$.

(a) If $\textrm{E}[y_t] \le \mu$ for all $t$, then $\Pr[Y \ge (1+\varepsilon)\mu T] < \exp(-\varepsilon^2 \mu T/3)$.

(b) If $\textrm{E}[y_t] \ge \mu$ for all $t$, then $\Pr[Y \le (1-\varepsilon)\mu T] < \exp(-\varepsilon^2 \mu T/2)$.

Proof.

Here is a proof for part (a).

\begin{equation} \label{eqn} \Pr[ Y \ge (1+\varepsilon)\mu T] ~=~ \Pr\Big[ (1+\varepsilon)^Y \ge (1+\varepsilon)^{(1+\varepsilon)\mu T}\Big] ~\le~ \frac{\textrm{E}[ (1+\varepsilon)^Y ]}{(1+\varepsilon)^{(1+\varepsilon)\mu T}}. \end{equation}

(The inequality above follows from the Markov bound.)

Next we bound the above expectation. The independence of the $y_t$'s implies

$\textrm{E}\big[ (1+\varepsilon)^Y \big] ~=~ \prod_{t=1}^T \textrm{E}[ (1+\varepsilon)^{y_t} ] ~\le~ \prod_{t=1}^T \textrm{E}[ 1+\varepsilon y_t ] ~\le~ \prod_{t=1}^T (1+\varepsilon \mu) ~<~ e^{\varepsilon \mu T}.$

(The second step uses the convexity of $(1+\varepsilon)^y$ as a function of $y$: specifically, $(1+\varepsilon)^y \le 1+\varepsilon y$ for $y \in [0,1]$. The third step uses $\textrm{E}[y_t] \le \mu$. The fourth step uses $1+z < e^z$ for $z > 0$.)
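As a sanity check (not a proof), the following sketch verifies the two elementary inequalities above on a grid of values in $[0,1]$:

```python
import math

# Numeric spot-check of the inequalities used in the expectation bound,
# over a grid of eps and y values in [0, 1].
for i in range(21):
    eps = i / 20
    for j in range(21):
        y = j / 20
        # Second step: (1+eps)^y <= 1 + eps*y for y in [0,1] (convexity).
        assert (1 + eps) ** y <= 1 + eps * y + 1e-12
    # Fourth step: 1 + z <= e^z for all z >= 0 (here z plays the role of eps*mu).
    assert 1 + eps <= math.exp(eps) + 1e-12
print("inequality checks passed")
```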

The rest is algebra. Combining the above bound with (\ref{eqn}), the probability in question is less than

$\frac{e^{\varepsilon \mu T}}{(1+\varepsilon)^{(1+\varepsilon)\mu T}} ~=~ \Big( \frac{e^{\varepsilon}}{(1+\varepsilon)^{1+\varepsilon}} \Big)^{\mu T} ~<~ \exp(-\varepsilon^2 \mu T/3).$

(The last inequality follows from $e^{\varepsilon}/(1+\varepsilon)^{1+\varepsilon} \le \exp(-\varepsilon^2/3)$ for $0 \le \varepsilon \le 1$.)
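The following sketch checks the final inequality numerically on a grid, and then compares the part (a) bound with an empirical tail probability for Bernoulli summands (the parameters $T$, $p$, $\varepsilon$ are arbitrary choices for illustration, not from the text):

```python
import math
import random

random.seed(1)

# Spot-check e^eps / (1+eps)^(1+eps) <= exp(-eps^2/3) for eps in [0, 1].
for i in range(101):
    eps = i / 100
    assert math.exp(eps) / (1 + eps) ** (1 + eps) <= math.exp(-eps**2 / 3) + 1e-12

# Empirical check of part (a) with independent Bernoulli(p) summands.
T, p, eps = 200, 0.3, 0.5
mu = p                      # E[y_t] = p, so mu = p satisfies E[y_t] <= mu
threshold = (1 + eps) * mu * T
trials = 5000
hits = sum(
    sum(1 for _ in range(T) if random.random() < p) >= threshold
    for _ in range(trials)
)
empirical = hits / trials
bound = math.exp(-eps**2 * mu * T / 3)
print(f"empirical tail {empirical:.4f} <= Chernoff bound {bound:.4f}")
assert empirical <= bound
```

With these parameters the bound is $e^{-5} \approx 0.0067$, while the true tail probability is far smaller, as is typical: the Chernoff bound is loose but decays exponentially.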

The proof for part (b) is similar: change the sign of $\varepsilon$ throughout, and use $e^{-\varepsilon}/(1-\varepsilon)^{1-\varepsilon} \le \exp(-\varepsilon^2/2)$ for $0 \le \varepsilon \le 1$.

Exercise.

Prove part (b) of the Chernoff bound.

Remark: the proof needs only first-order (linear) approximations.

The particular inequalities used in the proof above are elegant and convenient, but other inequalities could be used just as well. For example, we could change the base of the exponent in the proof from $1+\varepsilon$ to $e^{\varepsilon}$ and then push the proof through using inequalities such as $e^{\varepsilon y} \le 1 + (e^{\varepsilon}-1)y$ for $y \in [0,1]$. The proof will work with any inequalities in which the first-order terms (for small $\varepsilon$) are tight, although the resulting bound may have a smaller constant in the exponent.

Remark: upper bounds when $\varepsilon$ is large.

Sometimes one wants an upper bound on the probability of a large upper deviation: $\Pr[Y \ge (1+\varepsilon)\mu T]$ where $\varepsilon$ is larger than 2. The proof above for part (a) applies up to the last step, showing an upper bound of $\big(e^{\varepsilon}/(1+\varepsilon)^{1+\varepsilon}\big)^{\mu T} \le \big(e/(1+\varepsilon)\big)^{(1+\varepsilon)\mu T}$. For example, if you throw $T$ balls into $T$ bins, then $\mu T = 1$ for any given bin, and taking $1+\varepsilon = c\ln T/\ln\ln T$ for a suitable constant $c$, the probability that a given bin has more than $c\ln T/\ln\ln T$ balls is less than $1/T^2$.
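The balls-in-bins calculation can be illustrated numerically. In the sketch below, the choice $c = 3$ is a hypothetical constant (any sufficiently large constant works); with $\mu T = 1$ for a given bin, the bound from the proof reduces to $(e/(1+\varepsilon))^{1+\varepsilon}$:

```python
import math

# With T balls in T bins, a given bin receives each ball with probability
# 1/T, so mu*T = 1 and the large-deviation bound is (e/(1+eps))^(1+eps).
# Taking 1+eps = c*ln(T)/ln(ln(T)) with c = 3 (a hypothetical constant)
# pushes the bound below 1/T^2 at these sizes.
c = 3
for T in (100, 10**4, 10**6):
    k = c * math.log(T) / math.log(math.log(T))   # k = 1 + eps
    bound = (math.e / k) ** k
    print(f"T={T}: bound={bound:.3g}, 1/T^2={1 / T**2:.3g}")
    assert bound < 1 / T**2
```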

Pessimistic estimators.

When the time comes to apply the method of conditional probabilities to an existence proof that uses the Chernoff bound, you’ll need a pessimistic estimator for the bound.
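To make the idea concrete, here is a small self-contained sketch of a pessimistic estimator in a hypothetical setting (not from the text): derandomizing $T$ fair 0/1 choices so that the final sum lands strictly between $(1-\varepsilon)\mu T$ and $(1+\varepsilon)\mu T$. The estimator is the sum of the upper- and lower-tail quantities from the Chernoff proofs; it upper-bounds the conditional failure probability, starts below 1, and never increases if each $y_t$ is chosen to minimize it.

```python
import math

# Hypothetical demo: T fair 0/1 choices, mu = 1/2.  phi(base=1+eps) and
# phi(base=1-eps) are the pessimistic estimators from the upper- and
# lower-tail Chernoff arguments; their sum bounds the conditional
# probability that Y leaves ((1-eps)*mu*T, (1+eps)*mu*T).
T, eps, mu = 100, 0.3, 0.5

def estimator(fixed_sum, fixed_count, base):
    """Chernoff pessimistic estimator with the given base (1+eps or 1-eps)."""
    unfixed = T - fixed_count
    # Fixed choices contribute base^{y_t}; each unfixed y_t contributes its
    # expectation E[base^{y_t}] = 1 + (base-1)*mu.
    numer = base ** fixed_sum * (1 + (base - 1) * mu) ** unfixed
    if base > 1:   # upper tail: Pr[Y >= (1+eps)*mu*T]
        return numer / base ** ((1 + eps) * mu * T)
    else:          # lower tail: Pr[Y <= (1-eps)*mu*T]
        return numer / base ** ((1 - eps) * mu * T)

y = []
for t in range(T):
    # Fix y_t to the value minimizing the estimator sum; the minimum is at
    # most the average, so the estimator never increases.
    candidates = []
    for v in (0, 1):
        s = sum(y) + v
        phi = estimator(s, t + 1, 1 + eps) + estimator(s, t + 1, 1 - eps)
        candidates.append((phi, v))
    y.append(min(candidates)[1])

Y = sum(y)
print(f"sum(y) = {Y}, window = ({(1 - eps) * mu * T}, {(1 + eps) * mu * T})")
assert (1 - eps) * mu * T < Y < (1 + eps) * mu * T
```

Because the initial estimator value is well below 1 for these parameters and the greedy choice keeps it from increasing, the final assertion is guaranteed, not merely likely.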