====== Lab 06 - Asymptotic notations ======

==== 2. Properties of asymptotic notations ====

**2.1** Prove that $math[\lim_{n\rightarrow \infty} {g(n) \over f(n)} = 0] implies $math[g(n) \in o(f(n))]. Hint: use the "epsilon" ("Cauchy") limit definition for sequences.
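
For reference, one common formalisation of the two notions in the hint (the little-o definition below is the CLRS-style one; other equivalent formulations exist): a sequence satisfies $math[\lim_{n\rightarrow \infty} a_n = 0] iff for every $math[\epsilon > 0] there exists $math[n_0] such that $math[|a_n| < \epsilon] for all $math[n \geq n_0]; and $math[g(n) \in o(f(n))] iff for every constant $math[c > 0] there exists $math[n_0] such that $math[0 \leq g(n) < c \cdot f(n)] for all $math[n \geq n_0].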

**2.2** Prove that if $math[f(n) \in Ω(log(n))] and $math[g(n) \in O(n)], then $math[f(n) \in Ω(log(g(n)))].

**2.3** Prove that if $math[f(n) \in Ω(g(n))] and $math[g(n) \in O(n^2)], then $math[{g(n) \over f(n)} \in O(n)].

**2.4** $math[O(n) \cap Ω(n) = Θ(n)]?

**2.5** $math[O(n) = Θ(n) \cup o(n)]?
+ | |||
+ | ==== 3. Execution time vs. Asymptotic growth ==== | ||
+ | |||
+ | **3.1** Install gnuplot. | ||
+ | |||
+ | **3.2** Use the command ''set xrange [0:]'' to set the plotting interval to [0,infty). | ||
+ | |||
+ | Plot the function |$math[sin(n)*n]|, using the command: ''plot abs(sin(x)*x)''. ''abs'' stands for absolute value. | ||
+ | |||
+ | Use the command ''plot abs(sin(x)*x), x'' to compare the two functions. | ||
+ | |||
+ | **3.3** Write the following script file: | ||
+ | |||
+ | ########################### | ||
+ | # plotting script example | ||
+ | ########################### | ||
+ | |||
+ | reset | ||
+ | set xrange [0:] | ||
+ | |||
+ | f(x) = abs(sin(x)*x) | ||
+ | g(x) = x | ||
+ | |||
+ | plot f(x),g(x) | ||
+ | |||
+ | and save it as "example.plot". | ||
+ | |||
+ | Run gnuplot, and use the command ''load "example.plot"'' to run the script. | ||
+ | |||
+ | **3.4** Add the following data file "data.dat" to the directory: | ||
+ | |||
+ | 1 2 | ||
+ | 2 3 | ||
+ | 3 4 | ||
+ | 4 5 | ||
+ | 5 6 | ||
+ | 6 7 | ||
+ | 7 8 | ||
+ | 8 9 | ||
+ | |||
+ | And modify the last command to: | ||
+ | |||
+ | ''plot "data.dat",f(x),g(x)'' | ||
+ | |||
+ | **3.5** Write a function which computes the greatest common divisor of two numbers. | ||
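
A possible sketch in Python, using Euclid's algorithm (the function name ''gcd'' is only a suggestion):

<code>
def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a % b);
    # when the remainder reaches 0, a holds the greatest common divisor
    while b != 0:
        a, b = b, a % b
    return a
</code>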
+ | |||
+ | **3.6** What is the worst-case for the gcd algorithm? | ||
+ | |||
+ | **3.7** Write a function which computes fibonacci numbers. | ||
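
One possible sketch in Python, written iteratively so that it also works for the large indices used below (the name ''fibo'' follows the notation in task 3.8):

<code>
def fibo(n):
    # iterative computation: keeps only the last two values,
    # avoiding recursion limits and repeated work
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
</code>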
+ | |||
+ | **3.8** Start from the following code: | ||
+ | |||
+ | from time import time | ||
+ | |||
+ | |||
+ | l = [80000, 73201, 66003, 60000, 50000, 40000, 30000, 20000, 10000, 5000] | ||
+ | |||
+ | |||
+ | begin_time = time() | ||
+ | #[test code] | ||
+ | duration = time() - begin_time | ||
+ | |||
+ | and write a procedure which determines the running times for gcd, on the x-th and (x-1)th fibonacci numbers, from the list l. | ||
+ | |||
+ | The output of the script should be a list of values: | ||
+ | <code> | ||
+ | a b | ||
+ | </code> | ||
+ | |||
+ | where ''a'' is the **size of the input**, expressed as a natural number and ''b'' is the running time for gcd. | ||
+ | |||
+ | For instance, for x = 10000, ''b'' will be the running time of gcd, on the 10000th and 9999th fibonacci number. ''a'' will be the **size** of the input, i.e. the number of bits required to store the value $math[fibo(x) + fibo(x+1)], rounded to a natural number. | ||
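
A sketch of how the measurement could be wired together, assuming the ''gcd'' and ''fibo'' functions from tasks 3.5 and 3.7 are defined in the same file (the variable names and the use of ''bit_length'' are illustrative choices, not requirements):

<code>
from time import time

l = [80000, 73201, 66003, 60000, 50000, 40000, 30000, 20000, 10000, 5000]

for x in l:
    prev_fib = fibo(x - 1)
    curr_fib = fibo(x)

    begin_time = time()
    gcd(curr_fib, prev_fib)      # the test code: gcd on consecutive Fibonacci numbers
    duration = time() - begin_time

    size = (prev_fib + curr_fib).bit_length()   # input size: bits needed to store the sum of the two inputs
    print(size, duration)
</code>

Redirecting the output to a file (e.g. ''python3 script.py > gcd.dat''; both names are hypothetical) gives data in the same two-column format as "data.dat" from task 3.4, so it can be plotted directly in task 3.9.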
+ | |||
+ | **3.9** Write a gnuplot script to plot the data collected from the python script, and compare it to the **linear** function. Comment the results. How does the execution time grow? Find the suitable function. |