====== #3 Big Omega ======
The big Oh notation considered in the previous lesson is the most widely used asymptotic notation in computer science. The reason is that when designing a software system, we want to be sure that the algorithms we choose will be fast enough to solve our problems at the required input sizes, so we need some //upper bound// on how fast the program's running time $T(n)$ will grow as the input size $n$ increases. If we manage to prove that $T \in O(f)$, where $f$ is some function that doesn't grow too fast, then this is exactly the kind of upper bound we need: $T \in O(f)$ means that our algorithm's running time $T$ is guaranteed to grow //no more rapidly// than $f$ (for large inputs and up to a constant factor).
However, sometimes we are interested in a //lower bound// on an algorithm's running time, i.e., we would like to express the statement that the running time $T$ grows //no less rapidly// than some function $g$. An example is an encryption scheme: to feel safe when our documents are encrypted, we want a guarantee that a hacker who does not know the password will spend, on average, //at least// time of the order of magnitude $g(n)$ to break the encryption by a brute-force attack, where $n$ is the length of the password and $g$ is some function (one can say that an encryption scheme is good if this can be proved for a function $g$ that grows rather fast). This calls for another asymptotic notation: the big Omega.
**Definition (Big Omega).**
Let $f$ and $g$ be positive functions. We say that **$f$ grows asymptotically no slower than $g$** and write **$f\in \Omega(g)$** if there is a constant $c > 0$ such that for all sufficiently large $n$, we have $f(n)\geq c\cdot g(n)$. (Note: it says $\geq$ here, not $\leq$.)
{{:asymptotics:Big_Omega.png|size=200}}
**Figure (Big Omega)**. $f\in \Omega(g)$ means that there exist constants $c > 0$ and $n_0$ such that for all $n > n_0$, $f(n) \geq c\cdot g(n)$.
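As an informal sanity check (a table of values, not a proof), one can compare $f(n)$ with $c\cdot g(n)$ for growing $n$. Here is a minimal Python sketch, using the example functions $f(n) = 2n^2$ and $g(n) = 3n$ and the arbitrarily chosen constant $c = 1$:
<code python>
# Informal check that f(n) = 2*n^2 dominates g(n) = 3*n up to the constant c = 1
# for large n -- consistent with f in Omega(g). The functions and the constant
# are chosen only for this illustration.
def f(n): return 2 * n**2
def g(n): return 3 * n

c = 1
for n in (10, 100, 1000, 10_000):
    print(n, f(n) >= c * g(n), f(n) / g(n))
</code>
The ratio $f(n)/g(n)$ keeps growing, so in this example any positive constant $c$ eventually works.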
An example of a statement involving $\Omega$ is about the complexity of sorting algorithms. It can be shown that any sorting algorithm (that makes black-box comparisons) needs, in the worst case, at least $\log_2 n!$ comparisons to sort a list of $n$ elements.((The argument is actually very simple: each comparison gives at most one bit of information. There are $n!$ possible permutations of the list. To identify which of the permutations needs to be applied, we thus need $\log_2 n!$ bits, hence $\log_2 n!$ comparisons.)) Using big Omega notation, we can just write: any sorting algorithm has a worst-case number of comparisons of $\Omega(\log n!)=\Omega(n\log n)$. (The last equality can be shown using the Stirling formula; an elementary derivation is sketched below.) We cannot write $O(n\log n)$ here, because some sorting algorithms take many more than $n\log n$ comparisons (e.g., bubble sort).
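The last equality can also be verified without the full Stirling formula, by bounding $n!$ from above by $n^n$ and from below by the product of its $n/2$ largest factors (each of which is at least $n/2$):
$$\log n! \;\leq\; \log n^n \;=\; n\log n, \qquad \log n! \;\geq\; \log\left(\tfrac{n}{2}\right)^{n/2} \;=\; \tfrac{n}{2}\log\tfrac{n}{2},$$
and since $\tfrac{n}{2}\log\tfrac{n}{2} \in \Omega(n\log n)$, we get $\log n! \in \Theta(n\log n)$ and hence $\Omega(\log n!)=\Omega(n\log n)$.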
Analogously to Big Oh, for determining whether $f \in \Omega(g)$ it is often useful to consider the ratio of the functions. Namely, the following criterion holds.
Let $f$ and $g$ be positive functions. If $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ exists, then $f \in \Omega(g)$ if and only if $\lim_{n \to \infty} \frac{f(n)}{g(n)} > 0$.
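For example, taking $f(n) = n^{23} + n^3$ and $g(n) = n^{23} + 4$ (one of the pairs from the exercise below), the limit of the ratio is
$$\lim_{n \to \infty} \frac{n^{23}+n^3}{n^{23}+4} = 1 > 0,$$
so $f \in \Omega(g)$.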
The notions big Oh and big Omega are also tightly related:
$f \in \Omega(g)$ if and only if $g \in O(f)$.
Indeed, $f(n) \geq c\cdot g(n)$ with $c > 0$ is the same as $g(n) \leq \frac{1}{c}\cdot f(n)$, so the constant $c$ in one definition simply becomes $\frac{1}{c}$ in the other.
Which of the statements in the exercises for [[02_big_oh|big Oh]] are
true if we replace $O$ by $\Omega$?
++Answer| $n \in \Omega(2n+1)$, $2n+1 \in \Omega(n)$, $3n \not\in \Omega(2n^2)$, $2n^2 \in \Omega(3n)$, $2 \in \Omega(1)$, $\log_2 n \not\in \Omega(n)$, $n \in \Omega(\log_2 n)$, $n^{23} \not\in \Omega(n^{24})$, $n^{23} \in \Omega(n^{22})$, $n^{23}+n^3 \in \Omega(n^{23}+4)$, $12034273842374923+n^{23}+10+n^3\in \Omega(12034273842374923)$, $n \not\in \Omega(3^n)$, $3^n \in \Omega(n)$. ++
All of the properties listed in [[02_big_oh#lemma_bigoh_prop|Lemma (Some properties of big Oh)]] also hold
for $\Omega$ instead of $O$ with small modifications. What modifications need to be made to the lemma so that it holds for $\Omega$?
Alice tells Bob: "I developed a new sorting algorithm with running time $\Omega(\sqrt n)$." Bob is not impressed. Why?
++++Solution|
Unlike big Oh, big Omega is a lower bound. For example, if the running time of the algorithm is $T(n) = n^2$ or $T(n) = n^{100}$ or anything larger, then still $T \in \Omega(\sqrt n)$. So Alice's statement only gives a guarantee on how much time her algorithm spends //at least//. This is no achievement at all, since it is easy to create an algorithm that spends at least a given amount of time (e.g., simply put a ''sleep'' call at the end), as the sketch below illustrates.
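A minimal (and deliberately silly) Python sketch of this trick; the function name and the sleep constant are arbitrary and serve only to burn $\Omega(\sqrt n)$ time:
<code python>
import math
import time

def slow_sort(xs):
    """Sort xs, but first waste time proportional to sqrt(len(xs)).

    The running time is Omega(sqrt(n)) purely because of the sleep --
    which says nothing about the algorithm being fast.
    """
    n = len(xs)
    time.sleep(0.001 * math.sqrt(n))  # burn ~sqrt(n) milliseconds doing nothing
    return sorted(xs)                 # the actual sorting
</code>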
++++
[[asymptotics:04_multiple_variables]]
----