Master Theorem

Also called the Master Method.

Used to determine the asymptotic complexity of recurrences of the form:

(1)
$$ \begin{align} T(n) = a T\big(\frac{n}{b}\big) + O(n^d) \end{align} $$

Here a is the number of recursive calls made at each step. For example, in merge sort a = 2, since the algorithm is called recursively on the lower and upper halves of the array. The theorem assumes the problem is broken up into a subproblems of equal size.

b is the factor by which the input size shrinks at each step; that is, each subproblem has size n/b. For merge sort the input is halved, so b = 2.

d is the exponent of the work done outside the recursive calls, i.e. the work needed to split the problem and combine the results. In the case of merge sort a single pass over the data is needed, so this work is O(n) and d = 1.
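
To make the parameters concrete, here is a minimal merge sort sketch in Python (an illustrative implementation, not taken from the original text) annotated with where a, b, and d come from. With these parameters the merge sort recurrence is T(n) = 2T(n/2) + O(n).

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    # a = 2 recursive calls, each on an input of size n/b with b = 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combining the halves is a single O(n) pass, so d = 1
    return merge(left, right)

def merge(left, right):
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```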

To apply the theorem, one compares a to b^d and applies the formula for the appropriate case:

(2)
$$ \begin{align} \begin{cases} O(n^d \log n) & a = b^d\\ O(n^d) & a < b^d \\ O(n^{\log_b a}) & a > b^d \end{cases} \end{align} $$
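
For merge sort, a = 2 and b^d = 2^1 = 2, so the first case applies and the running time is O(n log n). As a quick sanity check, the case analysis can be written as a small Python helper (the name master_theorem is just for illustration, and this sketch assumes a, b, d describe a recurrence of form (1)):

```python
import math

def master_theorem(a, b, d):
    """Return the asymptotic bound for T(n) = a*T(n/b) + O(n^d)."""
    if math.isclose(a, b ** d):
        return f"O(n^{d} log n)"
    elif a < b ** d:
        return f"O(n^{d})"
    else:
        return f"O(n^{math.log(a, b):.3g})"

# Merge sort: a = 2, b = 2, d = 1, so a = b^d and the bound is O(n^1 log n),
# i.e. O(n log n).
print(master_theorem(2, 2, 1))
```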