====== Asymptotic Notations ======

  * Contacts: [[mihai.dumitru2201@stud.acs.upb.ro|Dumitru Mihai-Valentin]], [[matei.popovici@cs.pub.ro|Matei Popovici]]
  * Published: 9.11.2015

===== Outline =====

An **execution time** is a function $math[T:\mathbb{N}\rightarrow\mathbb{N}] which maps a value $math[n], representing **the size of the input** of an algorithm, to the **number of instructions** (**execution steps**) performed by the algorithm.

Suppose $math[f: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}] is a (not necessarily monotone) function over the positive reals. We define the sets $math[X(f(n)) \subseteq \mathrm{Hom}(\mathbb{R}^{+},\mathbb{R}^{+})], with $math[X\in\{O,o,\Theta,\omega,\Omega\}], which we call **asymptotic notations** (or simply notations).

**Question**: //Why are asymptotic notations defined over positive real functions, rather than over positive integers (as execution times are)?//

===== Notations =====

==== $math[\Theta] (Theta) Notation ====

$$ \Theta(f(n)) = \{ g: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}\ |\ \begin{array}{l} \exists c_1, c_2 \in \mathbb{R}^+\cr \exists n_0 \in \mathbb{N} \end{array}\ such\ that\ \forall\ n \ge n_0,\ \ c_1f(n) \le g(n) \le c_2f(n) \}$$

{{ :aa:lab:theta_fn.png |}}

We say that $math[\Theta(f(n))] is the class (set) of functions with **the same asymptotic growth as** $math[f(n)].

=== Example ===

Let $math[f(n) = n^2 + 8] and $math[g(n) = 3n^2 + 4n + 2]\\
We see that, for $math[n_0 = 1, c_1 = 1, c_2 = 4 \Rightarrow \ n^2 + 8 \le 3n^2 + 4n + 2 \le 4n^2 + 32]\\
Thus, $math[g(n) \in \Theta(f(n))]

==== $math[O] (Big O) Notation ====

$$ O(f(n)) = \{ g: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}\ |\ \begin{array}{l} \exists c \in \mathbb{R}^+\cr \exists n_0 \in \mathbb{N} \end{array}\ such\ that\ \forall\ n \ge n_0,\ \ 0 \le g(n) \le cf(n) \}$$

{{ :aa:lab:o_fn.png |}}

We say that $math[O(f(n))] is the class (set) of functions that grow asymptotically **at most as much as** $math[f(n)].

=== Example ===

Let $math[f(n) = n\log(n) + n] and $math[g(n) = n\log(n) + n + 10000]\\
We see that, for $math[n_0 = 10000, c = 2 \Rightarrow 0 \le n\log(n) + n + 10000 \le 2n\log(n) + 2n]\\
Thus, $math[g(n) \in O(f(n))]

==== $math[\Omega] (Big Omega) Notation ====

$$ \Omega(f(n)) = \{ g: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}\ |\ \begin{array}{l} \exists c \in \mathbb{R}^+\cr \exists n_0 \in \mathbb{N} \end{array}\ such\ that\ \forall\ n \ge n_0,\ \ 0 \le cf(n) \le g(n) \}$$

{{ :aa:lab:omega_fn.png |}}

We say that $math[\Omega(f(n))] is the class (set) of functions that grow asymptotically **at least as much as** $math[f(n)].

=== Example ===

Let $math[f(n) = n^4] and $math[g(n) = 2^n]\\
We see that, for $math[n_0 = 16, c = 1 \Rightarrow 0 \le n^4 \le 2^n]\\
Thus, $math[g(n) \in \Omega(f(n))]
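The witnesses used in the examples above can be sanity-checked numerically. The following is a minimal Python sketch (not part of the lab itself, and not a proof, since it only probes a finite range of $math[n]); it checks the $math[\Theta] example witnesses $math[c_1 = 1, c_2 = 4, n_0 = 1] on a sample of inputs. The same pattern applies to the $math[O] and $math[\Omega] examples by checking $math[0 \le g(n) \le cf(n)] and $math[0 \le cf(n) \le g(n)] respectively.

<code python>
# Numerical sanity check for the Theta example above (not a proof):
# it only verifies c1*f(n) <= g(n) <= c2*f(n) on a finite sample of n >= n0.

def f(n):
    return n**2 + 8

def g(n):
    return 3 * n**2 + 4 * n + 2

c1, c2, n0 = 1, 4, 1

for n in range(n0, 10_001):
    assert c1 * f(n) <= g(n) <= c2 * f(n), f"witnesses fail at n = {n}"

print("c1*f(n) <= g(n) <= c2*f(n) holds for all sampled n in [1, 10000]")
</code>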
==== $math[o] (Little O) Notation ====

$$ o(f(n)) = \{ g: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}\ |\ \begin{array}{l} \forall c \in \mathbb{R}^+\cr \exists n_0 \in \mathbb{N} \end{array}\ such\ that\ \forall\ n \ge n_0,\ \ 0 \le g(n) \le cf(n) \}$$

We say that $math[o(f(n))] is the class (set) of functions that grow asymptotically **strictly less than** $math[f(n)].

=== Example ===

Let $math[f(n) = n^2 + 3n + 6] and $math[g(n) = 8n + 3]\\
We see that, for $math[c = 1, n_0 = 5 \Rightarrow 0 \le 8n + 3 \le n^2 + 3n + 6]\\
for $math[c = 0.5, n_0 = 13 \Rightarrow 0 \le 8n + 3 \le 0.5n^2 + 1.5n + 3]\\
for $math[c = 0.1, n_0 = 78 \Rightarrow 0 \le 8n + 3 \le 0.1n^2 + 0.3n + 0.6]\\
etc.\\
Thus, $math[g(n) \in o(f(n))]

==== $math[\omega] (Little Omega) Notation ====

$$ \omega(f(n)) = \{ g: \mathbb{R}^{+} \rightarrow \mathbb{R}^{+}\ |\ \begin{array}{l} \forall c \in \mathbb{R}^+\cr \exists n_0 \in \mathbb{N} \end{array}\ such\ that\ \forall\ n \ge n_0,\ \ 0 \le cf(n) \le g(n) \}$$

We say that $math[\omega(f(n))] is the class (set) of functions that grow asymptotically **strictly more than** $math[f(n)].

=== Example ===

Let $math[f(n) = \log(n)] and $math[g(n) = n]\\
We see that, for $math[c = 1, n_0 = 1 \Rightarrow 0 \le \log(n) \le n]\\
for $math[c = 10, n_0 = 64 \Rightarrow 0 \le 10 \log(n) \le n]\\
for $math[c = 100, n_0 = 1024 \Rightarrow 0 \le 100 \log(n) \le n]\\
etc.\\
Thus, $math[g(n) \in \omega(f(n))]
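In both the little-$math[o] and little-$math[\omega] definitions the inequality must hold for //every// constant $math[c]. When the limit of the ratio $math[\frac{g(n)}{f(n)}] exists, a convenient intuition is: if the ratio tends to $math[0] then $math[g(n) \in o(f(n))], and if it grows without bound then $math[g(n) \in \omega(f(n))]. Below is a minimal Python sketch illustrating this on the two examples above (an intuition aid only, not a proof):

<code python>
import math

# Intuition aid for the little-o / little-omega examples above (not a proof):
# if g(n)/f(n) -> 0 then g is in o(f); if g(n)/f(n) -> infinity then g is in omega(f).

def ratios(g, f, ns):
    return [g(n) / f(n) for n in ns]

ns = [10, 100, 1_000, 10_000, 100_000]

# little-o example: g(n) = 8n + 3, f(n) = n^2 + 3n + 6 -- the ratio shrinks towards 0
print(ratios(lambda n: 8 * n + 3, lambda n: n**2 + 3 * n + 6, ns))

# little-omega example: g(n) = n, f(n) = log(n) -- the ratio grows without bound
print(ratios(lambda n: n, lambda n: math.log2(n), ns))
</code>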
===== Syntactic sugars =====

Asymptotic notations are often used to refer to arbitrary functions that have a certain growth. To simplify things, we can write //arithmetic expressions// as follows:

$$ f(n) = \Theta(n) + O(\log(n)) $$

which should be read as: $math[\exists g \in \Theta(n)] and $math[\exists h \in O(\log(n))] such that $math[f(n) = g(n) + h(n),\ \forall n \in \mathbb{R}^{+}].

We can also write //equations// like the following:

$$\Theta(n^2) = O(n^2) + o(n)$$

which should be read as: $math[\forall f \in \Theta(n^2),\ \exists g \in O(n^2)] and $math[h \in o(n)] such that $math[f(n) = g(n) + h(n),\ \forall n \in \mathbb{R}^{+}].

Note that equations are not symmetric and should only be read from left to right. Consider:

$$\Theta(n) = O(n)$$

While it is true that, for any function in $math[\Theta(n)], there is a function equal to it in $math[O(n)], we can clearly see that there are functions in $math[O(n)] for which there is no correspondent in $math[\Theta(n)] (e.g. $math[f(n) = 1, f(n) = \log(n)] etc.)

As a rule, each asymptotic notation on the left side of the equal sign should be read as a **universally quantified** function ($math[\forall f]) from that class, and each asymptotic notation on the right should be read as an **existentially quantified** function ($math[\exists g]) from that class. For example:

$$\left(\frac{\omega(n^2)}{\Theta(n)}\right) = \Omega(n) + o(n)$$

should be read as: $math[\forall f \in \omega(n^2)\ and\ \forall g \in \Theta(n),\ \exists h \in \Omega(n)] and $math[\exists j \in o(n)] such that $math[\left(\frac{f(n)}{g(n)}\right) = h(n) + j(n),\ \forall n \in \mathbb{R}^{+}]

===== (Counter)intuitive examples =====

Which of the following is true:
  * $math[o(f(n)) \cap \omega(f(n)) = \emptyset]
  * $math[o(f(n)) \subsetneq O(f(n))]
  * $math[O(f(n)) \setminus o(f(n)) = \Theta(f(n))]

===== Exercises (asymptotic notations) =====

Check the truth of the following:
  * $math[\sqrt{n} \in O(\log(n))]
  * $math[\log(n) \in O(\log(\log(n)))]
  * $math[n \in O(\log(n)\cdot\sqrt{n})]
  * $math[n + \log(n) \in \Theta(n)]

Prove that:
  * $math[\log(n\cdot \log(n))\in\Theta(\log(n))]
  * $math[\sqrt{n}\in\omega(\log(n))]
  * $math[f(n) + g(n) \in O(n\cdot\log(n))] for $math[f(n)\in\Theta(n)] and $math[g(n)\in O(n\cdot\log n)]

===== Exercises (syntactic sugars) =====

  * $math[\frac{O(n\sqrt{n})}{\Theta(n)} = \ldots]
  * $math[\frac{\Theta(n)}{O(\log(n))} = \ldots]
  * $math[\lvert\Theta(f(n)) - \Theta(f(n))\rvert = \ldots]

===== Practice =====

Prove/disprove the following:
  * $math[f(n) = \Omega(\log(n))] and $math[g(n)=O(n) \implies f(n)=\Omega(\log(g(n)))]
  * $math[f(n) = \Omega(\log(n))] and $math[g(n)=O(n) \implies f(n)=\Theta(\log(g(n)))]
  * $math[f(n) = \Omega(g(n))] and $math[g(n)=O(n^2) \implies \frac{g(n)}{f(n)}=O(n)]

Find two functions $math[f] and $math[g] such that:
  * $math[f(n) = O(g^2(n))]
  * $math[f(n) = \omega(\log(g(n)))]
  * $math[f(n) = \Omega(f(n)\cdot g(n))]
  * $math[f(n) = \Theta(g(n)) + \Omega(g^2(n))]
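For the exercises and practice problems above, it can help to build a rough numerical intuition before attempting a proof. The sketch below is illustrative only (the helper name ''growth_table'' is not part of the lab): it tabulates the ratio of two candidate functions for increasing $math[n]. A ratio drifting towards $math[0] or growing without bound suggests a strict asymptotic ordering, while a ratio settling around a positive constant suggests a $math[\Theta] relationship; the actual answer still requires exhibiting the constants from the definitions.

<code python>
import math

# Rough intuition helper for the exercises above (illustrative only, not a proof):
# prints f(n)/g(n) for increasing n.

def growth_table(f, g, ns):
    for n in ns:
        print(f"n = {n:>10}: f(n)/g(n) = {f(n) / g(n):.6f}")

# e.g. compare sqrt(n) against log(n), as in the first exercise
growth_table(lambda n: math.sqrt(n), lambda n: math.log2(n),
             [2**k for k in range(4, 25, 4)])
</code>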