# Master Theorem
**An asymptotic analysis, using [[big O Notation]], of the [[Recurrence Relation|recurrence relations]] describing the running time of [[divide and conquer]] algorithms.**
Consider a recursive divide and conquer algorithm which divides an input of size $n$ into $a$ subproblems, each of size $\frac{n}{b}$[^1]. The subproblems are the parts of the input which are recursed on further. The total work the algorithm must complete is the *sum of the work* performed on *every subproblem* plus the work of *dividing* the input and *combining* the results.
![[RecursionTree.svg]]
[^2]
Thus, the [[time complexity]] of the algorithm can be expressed by the *recurrence relation*
$T(n)=a\;T\left(\frac{n}{b}\right)+f(n)$
where $f(n)$ is the time required for the division and combining steps.
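Unrolling the recurrence one level at a time shows where the critical exponent in the theorem below comes from. Assuming for simplicity that $n$ is an exact power of $b$, level $i$ of the recursion tree contains $a^{i}$ subproblems of size $\frac{n}{b^{i}}$, so
$T(n)=\sum_{i=0}^{\log_{b}n-1}a^{i}\,f\left(\frac{n}{b^{i}}\right)+\Theta\left(a^{\log_{b}n}\right)=\sum_{i=0}^{\log_{b}n-1}a^{i}\,f\left(\frac{n}{b^{i}}\right)+\Theta\left(n^{\log_{b}a}\right)$
using the identity $a^{\log_{b}n}=n^{\log_{b}a}$. The three cases of the theorem compare $f(n)$ against this leaf term $n^{\log_{b}a}$.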
> [!Example] Master Theorem
> For a recursive divide and conquer algorithm with time complexity
> $T(n)=a\;T\left(\frac{n}{b}\right)+f(n)$
> where $a\ge 1$ is an integer, $b>1$ is a real number, $f(n)>0$ is a non-decreasing function defined on the positive integers, and $a$ and $b$ do not depend on $n$, there are three main cases for its asymptotic bound, which depend on the *critical exponent*
> $c^{*}=\log_{b}a$
> - If $f(n)=O(n^{c^{*}-\varepsilon})$ for some $\varepsilon > 0$, then
> $T(n)=\Theta(n^{c^{*}})$
> and the work is dominated by conquering the subproblems.
> - If $f(n)=\Theta(n^{c^{*}}\log^{k}n)$ for some $k\ge 0$, then
> $T(n)=\Theta(n^{c^{*}}\log^{k+1} n)$
> and the work of conquering the subproblems is roughly equal to the work of dividing and combining them.
> - If $f(n)=\Omega(n^{c^{*}+\varepsilon})$ for some $\varepsilon>0$, and for some constant $k<1$ and some $n_0$,
> $a\;f\left(\frac{n}{b}\right)\le kf(n)$
> holds for all $n>n_0$, then $T(n)=\Theta(f(n))$ and the work is dominated by dividing and combining the subproblems.
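For the common case where the driving function has the polynomial-logarithmic form $f(n)=\Theta(n^{c}\log^{k}n)$, the case analysis is mechanical. Below is a minimal Python sketch of that analysis; the function `master_theorem` and its string output are illustrative names, not part of any library.
```python
import math

def master_theorem(a: int, b: float, c: float, k: float = 0.0) -> str:
    """Classify T(n) = a*T(n/b) + Theta(n^c log^k n) via the master theorem."""
    c_star = math.log(a, b)  # critical exponent log_b(a)
    if math.isclose(c, c_star):
        if k >= 0:
            # Case 2: work is balanced across the levels; gain one log factor.
            return f"Theta(n^{c_star:g} log^{k + 1:g} n)"
        raise ValueError("k < 0 falls outside the cases stated above")
    if c < c_star:
        # Case 1: the leaves dominate.  Log factors in f cannot close a
        # polynomial gap, so the bound is the leaf term alone.
        return f"Theta(n^{c_star:g})"
    # Case 3: dividing and combining dominate.  The regularity condition holds
    # automatically here: a*f(n/b) is roughly (a / b^c) * f(n), and a / b^c < 1
    # exactly when c > c_star.
    return f"Theta(n^{c:g} log^{k:g} n)"

print(master_theorem(2, 2, 1))  # merge sort    -> Theta(n^1 log^1 n)
print(master_theorem(1, 2, 0))  # binary search -> Theta(n^0 log^1 n)
print(master_theorem(3, 2, 1))  # Karatsuba     -> Theta(n^1.58496)
```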
> [!Example]- Merge sort
> In [[merge sort]], each recursive call divides the list into $2$ sublists of size $\frac{n}{2}$. Thus, $a=2$, $b=2$, and the critical exponent is
> $c^{*}=\log_{2}2=1$
> As the combining (merge) step of each recursive call requires $\Theta(n)$ operations, $f(n)=\Theta(n)$ and the recurrence relation is
> $T(n)=2T\left(\frac{n}{2}\right)+\Theta(n)$
> Since $f(n)=\Theta(n^{c^{*}})$ with $c^{*}=1$, the second case of the master theorem (with $k=0$) gives the time complexity
> $T(n)=\Theta(n^{c^{*}}\log n)=\Theta(n\log n)$
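To make the constants concrete, here is a minimal merge sort sketch in Python with the $a=2$ subproblems and the $\Theta(n)$ merge step annotated.
```python
def merge_sort(xs: list) -> list:
    """Sort xs; the recurrence is T(n) = 2 T(n/2) + Theta(n)."""
    if len(xs) <= 1:  # base case: T(1) = Theta(1)
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # subproblem 1 of a = 2, size n / b = n / 2
    right = merge_sort(xs[mid:])  # subproblem 2 of a = 2
    # Combining step: merging two sorted halves takes Theta(n) comparisons
    # and copies, so f(n) = Theta(n).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```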
[^1]: The sizes of the subproblems are often not exactly equal. However, the asymptotic analysis only requires that each subproblem has size roughly $\frac{n}{b}$; in particular, the theorem still holds when the sizes are $\left\lfloor\frac{n}{b}\right\rfloor$ or $\left\lceil\frac{n}{b}\right\rceil$.
[^2]: Every instance splits into $a$ instances.