Asymptotic notation is used to express the results of [[asymptotic analysis]]. Interpret asymptotic notation as "roughly proportional to when $n$ is large".

# $O$-notation

$O$-notation ("Big-O notation") gives the asymptotic upper bound for an algorithm. The running time of the algorithm will grow no faster than this rate in the limit (that is, for all input sizes above some threshold $N_0$).

Formally: there exist some $k > 0$ and some $N_0 > 0$ such that for all $n \ge N_0$,
$f(n) \le kg(n)$

> [!NOTE] Notation
> Another way to write the above is
> $ \exists k > 0, \ \exists N_0 >0, \ \forall n\ge N_0, \ f(n) \le kg(n)$
> where $\exists$ represents "there exists" and $\forall$ represents "for all".

We write this as $f(n) = O(g(n))$ (pronounced "big-oh of g of n").

> [!NOTE] Notation
> Technically, $O(g(n))$ is a set of functions, and it would be more correct to write $f(n) \in O(g(n))$. However, we will use equality as it reduces clutter in equations.

Any $k > 0$ that satisfies the condition will do, and $f(n)$ is technically $O(g(n))$ for *any* function $g(n)$ that grows at least as fast as $f(n)$. For example, the function $f(n) = an^2 + bn + c$ is $O(n^2)$ and $O(n^3)$ and, more generally, $O(n^c)$ for any constant $c\ge 2$. #diagram

When stating that an algorithm is bounded by some function $g(n)$ that is not asymptotically tight, it is conventional to substitute the lower-case $o$ (e.g., in the example above, $O(n^3)$ is more properly written $o(n^3)$ and pronounced "little-oh of...").

# $\Omega$-notation

$\Omega$-notation ("Omega notation") gives the asymptotic lower bound of a function. If $f(n)$ is $\Omega(g(n))$, the running time of the algorithm will grow at least as fast as $g(n)$.

Formally: there exist some $k > 0$ and some $N_0 > 0$ such that for all $n \ge N_0$,
$f(n) \ge kg(n)$

For example, the function $f(n) = an^2 + bn + c$ is $\Omega (n^2)$ and $\Omega (n)$ and, more generally, $\Omega (n^c)$ for any constant $c\le 2$.

As with $O$-notation, when the bound is not asymptotically tight it is more proper to substitute the lower-case $\omega$ and say "little-omega of...".

# $\Theta$-notation

$\Theta$-notation ("Theta notation") gives the **tight bound** on the asymptotic behavior of a function. The running time of the algorithm will grow at precisely this rate (determined by the highest-order term), give or take some constant factor. If a function $f(n)$ is both $O(g(n))$ and $\Omega(g(n))$, it is $\Theta(g(n))$.

For example, the function $f(n) = an^2 + bn + c$ is both $O(n^2)$ and $\Omega (n^2)$, thus it is also $\Theta(n^2)$. In this case, it is most precise, and preferred, to characterize the worst-case running time of the algorithm as $\Theta (n^2)$. It is important to include the caveat "worst-case", as algorithms may have better average-case and best-case running times, *unless you are confident that the algorithm's average-case and best-case running times are the same*.

> [!Tip]
> Don't say "Big-O" when you mean "Theta" notation. They represent different things!
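
# Worked example: choosing $k$ and $N_0$

To make the claim that $f(n) = an^2 + bn + c$ is $O(n^2)$ concrete, here is one possible choice of witness constants (a sketch assuming $a, b, c > 0$; these particular constants are illustrative, not the only valid choice). Take $k = a + b + c$ and $N_0 = 1$. For all $n \ge 1$ we have $n \le n^2$ and $1 \le n^2$, so
$ an^2 + bn + c \le an^2 + bn^2 + cn^2 = (a + b + c)n^2 = kn^2 $
which is exactly the condition $f(n) \le kg(n)$ with $g(n) = n^2$. For the matching $\Omega(n^2)$ bound, $k = a$ and $N_0 = 1$ already work, since $an^2 + bn + c \ge an^2$; together the two bounds give $\Theta(n^2)$.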
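
As a numerical sanity check on the same bound, here is a minimal sketch in Python. The quadratic `f`, the candidate rate `g`, and the witness constants `K` and `N0` are illustrative assumptions carried over from the worked example above, not part of the definitions.

```python
# Empirically check the Big-O condition: f(n) <= K * g(n) for all n >= N0.
# f, g, K, and N0 are illustrative choices, not canonical values.

def f(n: int) -> int:
    # Example quadratic with a = 3, b = 5, c = 7.
    return 3 * n**2 + 5 * n + 7

def g(n: int) -> int:
    # Candidate growth rate: n^2.
    return n**2

K = 3 + 5 + 7   # k = a + b + c, the witness constant from the worked example
N0 = 1          # threshold input size

# The definition quantifies over all n >= N0; a finite scan can only
# support the claim, not prove it.
assert all(f(n) <= K * g(n) for n in range(N0, 10_000))
print(f"f(n) <= {K} * g(n) held for all n in [{N0}, 10000)")
```

A finite scan like this only lends evidence for the bound; the algebraic argument above is what actually establishes it for all $n \ge N_0$.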