# “Stopping-time” Chernoff bounds

Extending the Chernoff bound to handle sums with randomly many terms.

Alice goes to the casino and places bets on a sequence of fair coin flips. On the $t$th bet, Alice chooses an amount $b_t \in [0,1]$ to bet: she wins $b_t$ if this flip is heads, otherwise she loses $b_t$. Since it’s Vegas, she never stops playing. Fix any $\varepsilon, \mu > 0$ with $\varepsilon \le 1/2$. Let $W_t$ be the sum of bets won after the $t$th bet. Let $L_t$ be the sum of bets lost after the $t$th bet. Will Alice ever reach a time $t$ such that $W_t/(1+\varepsilon) - L_t/(1-\varepsilon) \ge \varepsilon\mu$? The bound below says that the probability that she does is less than $\exp(-\varepsilon^2\mu)$.
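To make the story concrete, here is a Monte Carlo sketch. All specifics are illustrative assumptions — Alice betting a constant $1$ on every flip, the finite horizon, and the parameter values $\varepsilon = 1/2$, $\mu = 10$ — and the estimate is compared against a tail bound of the form $\exp(-\varepsilon^2\mu)$:

```python
import math
import random

def alice_event_frequency(eps=0.5, mu=10.0, horizon=200, trials=2000, seed=0):
    """Estimate Pr[exists t <= horizon: W_t/(1+eps) - L_t/(1-eps) >= eps*mu]
    when Alice bets 1 on every fair coin flip (a hypothetical strategy)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        won = lost = 0.0
        for _ in range(horizon):
            if rng.random() < 0.5:
                won += 1.0    # heads: Alice wins her bet of 1
            else:
                lost += 1.0   # tails: Alice loses her bet of 1
            if won / (1 + eps) - lost / (1 - eps) >= eps * mu:
                hits += 1     # the event occurred in this trial
                break
    return hits / trials

freq = alice_event_frequency()
bound = math.exp(-0.5 ** 2 * 10.0)   # a bound of the form exp(-eps^2 * mu)
assert freq <= bound
```

Any other rule for choosing the bet sizes $b_t$ (even one that adapts to past flips) could be substituted for the constant bet; the bound does not depend on the strategy.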

# “Stopping-time” Chernoff bound

The bound applies to a system that goes through a random sequence of states $S_0, S_1, S_2, \ldots$, where each state $S_t$ (for $t \ge 1$) determines two values: $x_t$ and $y_t$, each in $[0,1]$, such that the expectation of $x_t$ is at most that of $y_t$:

$\textrm{E}[x_t \,|\, S_{t-1}] ~\le~ \textrm{E}[y_t \,|\, S_{t-1}].$

The bound shows that $\sum_{i=1}^t x_i$ is unlikely to ever significantly exceed $\sum_{i=1}^t y_i$ (for any $t$).

Lemma (stopping-time Chernoff).

Given the conditions above, let $\varepsilon, \mu > 0$ with $\varepsilon \le 1/2$. The probability that

\begin{equation} \label{event2} \textstyle \exists t.~~ \sum_{i=1}^{t} x_i/(1+\varepsilon) ~-~ \sum_{i=1}^{t} y_i/(1-\varepsilon) ~\ge~ \varepsilon \mu \end{equation}

is less than $\exp(-\varepsilon^2 \mu)$.

(Various bounds of this kind are possible. The constants in this bound are not optimized.)
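The lemma can also be checked empirically on a concrete system. The instance below is a hypothetical one chosen for illustration: $x_t$ and $y_t$ drawn i.i.d. uniform on $[0,1]$, so the two conditional expectations are equal and the hypothesis holds.

```python
import math
import random

def event_frequency(eps=0.5, mu=4.0, horizon=300, trials=1000, seed=0):
    """Empirical frequency of the bad event for one concrete system:
    x_t, y_t i.i.d. uniform on [0,1], so E[x_t|S_{t-1}] = E[y_t|S_{t-1}]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        total = 0.0
        for _ in range(horizon):
            x, y = rng.random(), rng.random()
            total += x / (1 + eps) - y / (1 - eps)
            if total >= eps * mu:    # the bad event occurred
                hits += 1
                break
    return hits / trials

freq = event_frequency()
assert freq <= math.exp(-0.5 ** 2 * 4.0)   # compare to exp(-eps^2 * mu)
```

Note that the per-step drift of $x_t/(1+\varepsilon) - y_t/(1-\varepsilon)$ is negative here, so the empirical frequency typically sits well under the bound; the bound's value is that it holds uniformly over all times $t$ at once.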

# Proof

The proof adapts the standard Chernoff proof. In outline: define the potential $M_t = \exp\big(\varepsilon\big(\sum_{i=1}^t x_i/(1+\varepsilon) - \sum_{i=1}^t y_i/(1-\varepsilon)\big)\big)$, so $M_0 = 1$. By convexity, $e^{cz} \le 1 + (e^c - 1)z$ for $z \in [0,1]$; applying this with $c = \varepsilon/(1+\varepsilon)$ and $c = -\varepsilon/(1-\varepsilon)$, then using $\textrm{E}[x_t \,|\, S_{t-1}] \le \textrm{E}[y_t \,|\, S_{t-1}]$ and $e^{\varepsilon/(1+\varepsilon)} - 1 \le 1 - e^{-\varepsilon/(1-\varepsilon)}$ (which holds for $\varepsilon \le 1/2$), each step satisfies $\textrm{E}[M_t \,|\, S_{t-1}] \le M_{t-1}$. So $(M_t)$ is a non-negative super-martingale, while \eqref{event2} at time $t$ implies $M_t \ge \exp(\varepsilon^2 \mu)$. By the maximal inequality for non-negative super-martingales, the probability that this ever happens is at most $M_0 \exp(-\varepsilon^2\mu) = \exp(-\varepsilon^2\mu)$.
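The heart of any such argument is a per-step bound: conditioned on the current state, one step multiplies the exponential potential by a factor of expectation at most $1$. As a numerical sketch (an illustration, not part of the original proof), the factor can be evaluated exactly when $x_t, y_t$ are independent Bernoulli variables with equal means:

```python
import math

def step_factor(eps, p):
    """E[exp(eps*x/(1+eps))] * E[exp(-eps*y/(1-eps))] for independent
    x, y ~ Bernoulli(p) -- equal means, as the lemma's condition requires."""
    a = eps / (1 + eps)
    b = eps / (1 - eps)
    return (1 - p + p * math.exp(a)) * (1 - p + p * math.exp(-b))

# The factor stays at most 1 over a grid of eps in (0, 1/2) and p in [0, 1].
ok = all(step_factor(e / 100, p / 100) <= 1.0 + 1e-12
         for e in range(1, 50)
         for p in range(0, 101))
assert ok
```

The grid check mirrors the algebra: $e^{\varepsilon/(1+\varepsilon)} - 1 \le 1 - e^{-\varepsilon/(1-\varepsilon)}$ makes the gain from $x_t$ no larger than the loss from $y_t$.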

# Pessimistic estimator

Note that the scaled potential $\phi_t = \exp\big(\varepsilon\big(\sum_{i=1}^t x_i/(1+\varepsilon) - \sum_{i=1}^t y_i/(1-\varepsilon)\big) - \varepsilon^2\mu\big)$ from the proof above also serves as a pessimistic estimator on the probability of event \eqref{event2}: that is, (a) its initial value, $\phi_0$, is $\exp(-\varepsilon^2\mu) < 1$, and (b) for each $t$, given the current state $S_t$, the conditional probability of \eqref{event2} is less than $\phi_t$ (so the event won’t happen as long as the algorithm keeps $\phi_t < 1$), and (c) $(\phi_t)$ is a super-martingale.
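A minimal sketch of how such an estimator drives derandomization, in a hypothetical setup (not from the original): at each step the algorithm must pick one of two outcome pairs, $(x_t, y_t) = (a, b)$ or $(b, a)$. Picking uniformly at random would satisfy $\textrm{E}[x_t] = \textrm{E}[y_t]$, so at least one of the two choices does not increase $\phi_t$; choosing greedily keeps $\phi_t < 1$, and since $\phi_t \ge 1$ exactly when the bad event has occurred by step $t$, the event never happens.

```python
import math
import random

def greedy_estimator_run(eps=0.5, mu=2.0, steps=500, seed=1):
    """Greedy run driven by the pessimistic estimator
    phi_t = exp(eps*(sum_i x_i/(1+eps) - sum_i y_i/(1-eps)) - eps^2*mu).
    Hypothetical setup: each step offers the outcome pairs (a,b) or (b,a)."""
    rng = random.Random(seed)
    log_phi = -eps * eps * mu                 # log phi_0 < 0, i.e. phi_0 < 1
    for _ in range(steps):
        a, b = rng.random(), rng.random()     # candidate values in [0,1]
        m1 = eps * a / (1 + eps) - eps * b / (1 - eps)  # log-change for (x,y)=(a,b)
        m2 = eps * b / (1 + eps) - eps * a / (1 - eps)  # log-change for (x,y)=(b,a)
        log_phi += min(m1, m2)                # greedy: keep phi as small as possible
        assert log_phi < 0                    # invariant: phi_t < 1, event avoided
    return math.exp(log_phi)

phi_final = greedy_estimator_run()
assert 0 < phi_final < 1
```

The invariant holds because the average of the two log-changes is non-positive (it equals $\varepsilon(a+b)\big(\tfrac{1}{1+\varepsilon} - \tfrac{1}{1-\varepsilon}\big)/2 \le 0$), so the minimum is as well.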


## Footnotes

1. Let $X_t = \sum_{i=1}^t x_i/(1+\varepsilon)$ and $Y_t = \sum_{i=1}^t y_i/(1-\varepsilon)$. \eqref{event2} implies $\exp(\varepsilon(X_t - Y_t)) \ge \exp(\varepsilon^2\mu)$. Using $e^{cz} \le 1 + (e^c - 1)z$ for $z \in [0,1]$ gives $\textrm{E}[\exp(\varepsilon(X_t - Y_t)) \,|\, S_{t-1}] \le \exp(\varepsilon(X_{t-1} - Y_{t-1}))$, i.e., $\exp(\varepsilon(X_t - Y_t))$ is a super-martingale.