
Markov's Inequality

Markov's inequality is a distribution-free tail bound: if a random variable is non-negative, its mean limits how often it can take large values.

Statement

If $X \ge 0$ and $a > 0$:

$$P(X \ge a) \le \frac{E[X]}{a}.$$

Intuition

If the average of $X$ is small, then $X$ cannot be “large” too often; otherwise the mean would be forced upward.
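This intuition becomes a one-line proof. For non-negative $X$, split the expectation using the indicator of the event $\{X \ge a\}$ (a sketch of the standard argument):

```latex
E[X] \;=\; E\big[X \cdot \mathbf{1}\{X \ge a\}\big] + E\big[X \cdot \mathbf{1}\{X < a\}\big]
\;\ge\; E\big[a \cdot \mathbf{1}\{X \ge a\}\big]
\;=\; a \, P(X \ge a).
```

Dividing both sides by $a$ gives the inequality. Non-negativity matters: it is what lets us drop the second term.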

Markov is often the first stepping stone to stronger bounds:

  • Apply Markov to $(X-\mu)^2$ to get Chebyshev’s inequality.
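
The Chebyshev step can be checked empirically. A minimal sketch, using an exponential distribution (mean 2, variance 4) purely as an illustrative choice:

```python
# Applying Markov to the non-negative variable (X - mu)^2 with a = k^2 gives
#   P((X - mu)^2 >= k^2) <= E[(X - mu)^2] / k^2 = Var(X) / k^2,
# i.e. Chebyshev's inequality. We verify the bound holds on simulated data.
import random

random.seed(0)
mu, var = 2.0, 4.0  # exponential with rate 1/2: mean 1/lambda, variance 1/lambda^2
samples = [random.expovariate(0.5) for _ in range(100_000)]

k = 3.0  # threshold for |X - mu|
chebyshev_bound = var / k**2  # = 4/9
empirical = sum(abs(x - mu) >= k for x in samples) / len(samples)

print(f"Chebyshev bound: {chebyshev_bound:.3f}")
print(f"Empirical P(|X - mu| >= {k}): {empirical:.3f}")
```

The empirical tail probability should always come out below the bound, often by a wide margin.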

When it’s useful

  • When you only know the mean $E[X]$ and nothing else about the distribution
  • For quick, conservative guarantees
  • In proofs (building block for Chebyshev and concentration results)

Tightness (why it can be loose)

Markov can be conservative because it uses only the mean. If the distribution has heavy tails or lots of mass near 0, the bound may be far from the true probability.
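
To see how loose the bound can get, compare it against an exact tail. A small sketch, using an exponential distribution with mean 1 as an illustrative case (its exact tail is $e^{-a}$):

```python
# Markov gives P(X >= a) <= E[X]/a; for Exp(1) the true tail is exp(-a).
# The gap between the two grows quickly as a increases.
import math

mean = 1.0
for a in [2, 5, 10]:
    markov = mean / a       # Markov's bound
    exact = math.exp(-a)    # exact tail probability for Exp(1)
    print(f"a={a:2d}  Markov bound={markov:.4f}  exact={exact:.6f}")
```

At $a = 10$ Markov guarantees at most 0.1, while the true probability is about $4.5 \times 10^{-5}$: the bound is valid but off by three orders of magnitude.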

Test Your Knowledge

Example: Using Markov's inequality

A factory produces lightbulbs with an average lifespan of 1000 hours. The distribution is unknown, but lifespan must be non-negative. Use Markov's Inequality to find an upper bound on the probability that a lightbulb lasts at least 4000 hours.

Step-by-step solution

Markov's formula: $P(X \ge a) \le \frac{E[X]}{a}$.

  • $E[X] = 1000$
  • $a = 4000$

$$P(X \ge 4000) \le \frac{1000}{4000} = 0.25.$$

Without knowing anything else about the distribution, we guarantee that at most 25% of the lightbulbs will last 4000 hours or more.
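
The arithmetic above as a one-line check (trivial sketch):

```python
# Markov's bound for the lightbulb example: mean 1000 hours, threshold 4000 hours.
mean_life = 1000
threshold = 4000
bound = mean_life / threshold  # P(X >= 4000) <= 0.25
print(bound)
```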