====== - #6 The negligible, the noticeable and the overwhelming ======
In this lesson we will meet three concepts related to the asymptotic notions that are especially often used in cryptography: the concepts of //negligible//, //overwhelming// and //noticeable// functions. This lesson assumes some basic knowledge of the concept of probability, as covered by lesson [[:probability:02_multiple_event_probability]] in the Probability theory topic.
===== - Negligible functions =====
Suppose we have a probabilistic algorithm $A$ which solves a problem with probability at least $p > 0$ and reports an error if it fails. The failure probability $q = 1-p$ might be too high for our requirements. However, we would be happy if the failure probability were below some practical threshold, like $2^{-90}$, which is less than the probability of our building being demolished by a meteorite during the computation. We can easily make the failure probability smaller by calling the algorithm again if it failed. Let $A^*$ be an algorithm which takes an input parameter $k$ and calls algorithm $A$ up to $k$ times. Then, since the runs are independent, the probability that $A^*$ will fail is $P_{\text{fail}}^{A^*}(k) = q^k$.
For example, when generating an RSA key, one needs to solve the problem of generating a random prime number of a given length; we can do this by generating a random number and checking it with some primality test — if the generated number passes the test, then it is the answer we wanted; if it fails the test, we try again with a new random number.
Is this a good idea, or is an unreasonably large number of calls needed to get a failure probability as small as desired? For concreteness, let $p = \frac12$. Then $q^k = 2^{-k}$. To achieve the very small failure probability $2^{-90}$, we only have to call the algorithm $A$ $90$ times, which is not much. For other values of $p$, the results are similar. So $A^*$ has rather good performance; this is actually the basic idea used for generating RSA keys in practice.
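To see concretely how few repetitions are needed, here is a small Python sketch (illustrative only, not part of the lesson) that computes the smallest $k$ with $q^k \leq 2^{-90}$ for a few values of $p$:
<code python>
import math

TARGET = 2.0 ** -90   # desired failure probability

def repetitions_needed(p, target=TARGET):
    """Smallest k with (1-p)**k <= target, i.e. k >= log(target)/log(1-p)."""
    q = 1.0 - p
    return math.ceil(math.log(target) / math.log(q))

for p in (0.5, 0.1, 0.01):
    print(f"p = {p:5}: k = {repetitions_needed(p)} repetitions suffice")
# p =   0.5: k = 90
# p =   0.1: k = 593
# p =  0.01: k = 6207
</code>
Even for quite small success probabilities $p$, the number of repetitions stays modest.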
Now suppose we have another algorithm — algorithm $B$ — which accepts some input parameter $k$, runs in time proportional to $k$ (i. e. in time $ck$ for some constant $c$) and fails with probability $P_{\text{fail}}^B(k) = \frac1k$. If we cannot make algorithm $B$ succeed with a higher probability by running it multiple times, like we did for $A$ (because $B$ is not probabilistic, or because it does not tell whether it failed), the only way to decrease its probability of failure is to increase the parameter $k$. However, to achieve failure probability $2^{-90}$, we would have to take $k = 2^{90}$ — giving a running time vastly longer than anybody could wait, even if the constant $c$ in the running time estimate $ck$ were equal to only 1 CPU cycle.
So, decreasing the failure probability of algorithm $B$ is much harder than that of $A^*$, because the function $\frac1k$ decreases much more slowly than the function $2^{-k}$.
{{:asymptotics:inverse_and_exponential.png}}
**Figure.** The function $\frac1k$ decreases much more slowly than the function $2^{-k}$.
Functions $\frac1{k^i}$, where $i$ is constant, are not much better than $\frac1k$. For instance, if the failure probability is $P_{\text{fail}}^B(k) = \frac{1}{k^2}$, then we must take $k = 2^{45}$ to achieve failure probability $2^{-90}$. Although the latter might even be practically feasible, it still takes a very long time compared to $90$ calls of algorithm $A$.
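The contrast can be made concrete with a short numeric sketch (illustrative only), computing for each failure law the parameter $k$ that pushes the failure probability below $2^{-90}$:
<code python>
import math

LOG_EPS = math.log(2.0 ** -90)   # natural log of the target failure probability

# Smallest parameter k for which each failure law drops below 2**-90:
k_exp  = math.ceil(-LOG_EPS / math.log(2))   # 2**-k  : k = 90
k_inv  = math.exp(-LOG_EPS)                  # 1/k    : k = 2**90 ~ 1.2e27
k_inv2 = math.exp(-LOG_EPS / 2)              # 1/k**2 : k = 2**45 ~ 3.5e13

print(f"2**-k : k = {k_exp}")
print(f"1/k   : k = {k_inv:.3e}")
print(f"1/k**2: k = {k_inv2:.3e}")
</code>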
As before, we use the gap between polynomials and the exponential function as a clear cut-off point. The algorithm $A^*$ is good because its failure probability $P_{\text{fail}}^{A^*}(k)$ approaches zero faster than the reciprocal of any polynomial. This leads to the following definition of //negligible// probabilities.
**Definition.** A positive function $\mu(n)$ is called **negligible** if for any positive polynomial $p(n)$ there exists a constant $n_0$ such that for all $n\geq n_0$ we have $\mu(n)<1/p(n)$.
We can restate the definition in an alternative form.
**Lemma.** A positive function $\mu(n)$ is negligible if and only if $\mu(n) \in O(1/{p(n)})$ for all positive polynomials $p$.
++++Proof|
We have to prove two things:
* necessity, i. e. that every negligible function belongs to $O(1/{p(n)})$ for all positive polynomials $p$;
* sufficiency, i. e. that any function which belongs to $O(1/{p(n)})$ for all $p$ is negligible.
Necessity follows directly from the definition of Big Oh: a negligible function $\mu$ eventually satisfies $\mu(n) < 1/p(n)$, so $\mu(n) \in O(1/p(n))$ with constant $1$.
For sufficiency, let $\mu$ be such a function and $p$ be a positive polynomial. We have to show that there exists a constant $n_0$ such that $\mu(n)<1/p(n)$ for all $n\geq n_0$. Note that $np(n)$ is also a positive polynomial, so by our assumption $\mu(n) \in O(\frac1{np(n)})$. Hence, by the definition of Big Oh, there exist constants $c$ and $n_1$ such that $\mu(n) \leq \frac{c}{np(n)}$ for all $n \geq n_1$. Now we can take $n_0 = \max \{ n_1,\ c+1\}$: indeed, then for all $n \geq n_0$, $\mu(n) \leq \frac{c}{np(n)} \leq \frac{c}{(c+1)p(n)} < \frac1{p(n)}$.
++++
In other words, a positive function is negligible if and only if it grows asymptotically no faster than the reciprocal of any positive polynomial.
Yet another equivalent condition is given by the following lemma.
**Lemma.** A positive function $\mu(n)$ is negligible if and only if for any positive polynomial $p(n)$ the product $p(n)\mu(n)$ converges to zero.
++++Proof|
We have to prove two things:
* necessity, i. e. that if $\mu$ is negligible then $p(n)\mu(n) \to 0$ for all positive polynomials $p$;
* sufficiency, i. e. that any function satisfying $p(n)\mu(n) \to 0$ for all positive polynomials $p$ is negligible.
Let us recall the property of Big Oh that $f \in O(g)$ if the limit $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ exists and is finite. From this property it is clear that if $p(n)\mu(n) \to 0$, then $\mu(n) \in O(1/{p(n)})$. So, using the previous lemma, sufficiency is obvious.
For the necessity part of the proof, assume that $\mu(n)$ is negligible and $p(n)$ is a positive polynomial. Then $np(n)$ is also a positive polynomial. By the definition of negligible function, there exists $n_0$ such that for all $n\geq n_0$, the inequality $\mu(n)<\frac{1}{p(n) n}$ holds, which implies
$$\mu(n)p(n)<\frac{1}{n}\enspace.$$
Q. E. D.
++++
In other words, even if an event with negligible probability could occur in polynomially many cases, the total probability would still approach zero for $n \to \infty$.
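For instance, if some bad event occurs in a single run with negligible probability $\mu(n)$ and we perform $p(n)$ runs, the union bound gives
$$\Pr[\text{bad event in some run}] \leq p(n)\mu(n) \to 0 \quad (n \to \infty)\enspace.$$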
Actually, as the following lemma says, to prove that a function is negligible, it suffices to consider only monomials $n^{i}$ instead of all polynomials $p(n)$.
**Lemma.** A positive function $\mu(n)$ is negligible if and only if for any natural number $i$ the product $n^i\mu(n)$ converges to zero.
++++Proof|
Taking into account the previous lemma, we only have to prove that if $n^i\mu(n) \to 0$ for every constant $i \in \mathbb N$, then $p(n)\mu(n) \to 0$ for every positive polynomial $p$.
Assume that $\mu(n)n^i$ converges to zero for any $i$ and let $p$ be a positive polynomial. Since $p(n) \in O(n^d)$ where $d$ is the degree of $p$ (by [[02_big_oh#big_Oh_of_some_well_known_functions|Lemma (Big Oh relations of some well-known functions)]]), there exists a constant $c$ such that $p(n) \leq cn^d$ for sufficiently large $n$. Since $0 \leq p(n)\mu(n) \leq cn^d \mu(n)$ for sufficiently large $n$ and $cn^d \mu(n) \to 0$ by our assumption, it follows that $p(n)\mu(n) \to 0$.
Q. E. D.
++++
For example, the function $P_{\text{fail}}^{A^*}(k) = 2^{-k}$ considered in the introduction is negligible since for any $i$, $\lim_{k \to \infty} \frac{k^i}{2^k} = 0$. On the other hand, the function $P_{\text{fail}}^{B}(k) = \frac1k$ is not negligible since, for example, $\lim_{k \to \infty} k \cdot \frac1k = 1 \neq 0$.
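These two limits are also easy to observe numerically; a small illustrative sketch:
<code python>
# Numeric illustration of the monomial criterion n**i * f(n) -> 0 (here i = 3):
# n**3 * 2**-n tends to 0, while n * (1/n) stays at 1.
for n in (10, 50, 100, 200):
    print(f"n={n:3}:  n^3 * 2^-n = {n**3 * 2.0**-n:.3e},  n * (1/n) = {n * (1.0/n):.1f}")
</code>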
In crypto, we consider probabilities that are negligible to be too small to matter. In other words, if an attack works only with negligible probability, then we consider that attack to be unsuccessful.
Which of the following statements is true? (Justify your answer.)
- $3^{-n}$ is negligible. ++Solution|True since $\lim_{n \to \infty} n^i 3^{-n} = 0$ for any $i$.++
- $n^{-100}$ is negligible. ++Solution|False. Although $n^{-100}$ is very small, this is not a negligible function. For example, $\lim_n n^{123}n^{-100} = \lim n^{23} = \infty$.++
- $2^n$ is negligible. ++Solution|False. Indeed, the condition $n^i 2^n \to 0$ already fails for the case $i = 0$, since the function $2^n$ is not bounded.++
- Consider a cryptosystem. On security parameter $\eta$, that scheme uses a block cipher with key length $\log \eta$. The adversary can (only) break the cryptosystem by guessing the key of the block cipher. Then the probability that the adversary breaks the cryptosystem is negligible in $\eta$. ++Solution|False. There are $2^{\log \eta} = \eta$ keys of length $\log\eta$ bits. Therefore, if the adversary tries a random key, the probability that this happens to be the actual key is $\frac1{\eta}$. The function $\frac1{\eta}$ is not negligible in $\eta$ as shown before.++
- The same, but with key length $\eta$ (instead of $\log\eta$). ++Solution|True. There are $2^\eta$ keys of length $\eta$ bits. Therefore, the probability that the adversary breaks the cryptosystem in one attempt is $2^{-\eta}$. The function $2^{-\eta}$ is negligible, as shown before. (We assume that the adversary makes only one attempt since if we allowed him to make arbitrarily many attempts, he could achieve arbitrarily large success probability and the question would be pointless as the answer would obviously be false. However, if we allowed him to make a constant number of attempts or even a number of attempts polynomial in $\eta$, the answer would be the same as for one attempt; this follows from [[#lemma_negl_prop|Lemma (Some properties of negligible functions)]] below.)++
- Optional (more tricky): The same, but with key length $(\log\eta)^2$. ++Solution|True. There are $2^{(\log\eta)^2} = \left(2^{\log\eta}\right)^{\log\eta} = \eta^{\log\eta}$ keys of length $(\log\eta)^2$. Hence, the attacker's success probability is $\frac1{\eta^{\log\eta}}$. Let $i$ be an arbitrary natural number. We have to show that $\eta^i \eta^{-\log\eta} \to 0$. Note that since $\log\eta$ tends to $\infty$, there exists $\eta_0$ such that $\log\eta \geq i+1$ for $\eta \geq \eta_0$. Therefore, for sufficiently large $\eta$, $\eta^i \eta^{-\log\eta} \leq \frac1{\eta}$. Q. E. D.++
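To get a feeling for the three key-length regimes in this exercise, here is a small numeric sketch (assuming, as in the exercise, that the adversary makes a single uniformly random guess, so the success probability is one over the number of keys):
<code python>
import math

# One uniformly random guess at a key of length L bits succeeds with probability 2**-L.
for eta in (2**10, 2**16, 2**20):
    lg = math.log2(eta)
    p_log  = 2.0 ** -lg         # key length log(eta)    -> probability 1/eta
    p_log2 = 2.0 ** -(lg ** 2)  # key length (log eta)^2 -> probability eta**(-log eta)
    print(f"eta = 2^{int(lg):2}:  1/eta = {p_log:.1e},  eta^(-log eta) = {p_log2:.1e}")
# 1/eta shrinks only polynomially in eta (not negligible);
# eta^(-log eta) shrinks faster than any 1/eta^i (negligible);
# key length eta gives probability 2^-eta, which is negligible as well.
</code>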
Of course, it is somewhat arbitrary to say that negligible functions are too small to matter while, e.g., $n^{-80}$ (which is very, very small) is considered to matter. The reason for using this particular class of functions in crypto is that it behaves quite nicely under various operations; this makes things easier in many situations. The following lemma lists some of these nice properties of negligible functions:
** Lemma (Some properties of negligible functions).**
- If $f_1$ and $f_2$ are negligible, then $f_1+f_2$ is negligible. ++Proof|If, for any positive polynomial $p$, $p(n)f_1(n) \to 0$ and $p(n)f_2(n) \to 0$, then $\lim p(n)(f_1(n) + f_2(n)) = \lim p(n)f_1(n) + \lim p(n)f_2(n) = 0 + 0 = 0$ as well.++
- If $f$ is negligible and $c>0$ is a constant, then $c\cdot f$ is negligible. ++Proof|If, for any positive polynomial $p$, $\lim p(n)f(n) = 0$, then $\lim cp(n)f(n) = c\lim p(n)f(n) = c \cdot 0 = 0$ as well.++
- If $f$ is negligible and $p$ is polynomially bounded (i.e., there is a positive polynomial $r$ that is bigger than $p$ for sufficiently large $n$), then $p\cdot f$ is negligible. ++Proof|Let $f$ be negligible. We have to show that for any positive polynomial $q$, the product $q(n)p(n)f(n)$ converges to $0$. Since $q(n)r(n)$ is a positive polynomial and $f$ is negligible, we know that $q(n)r(n)f(n) \to 0$; hence also $q(n)p(n)f(n) \to 0$, since $0 \leq q(n)p(n)f(n) \leq q(n)r(n)f(n)$ starting from some $n$.++
- If $f$ is negligible, then for any $c>0$, $f^c$ is negligible. (This also includes cases like $f^{1/2}=\sqrt f$.) ++Proof|Let $f$ be negligible. Then $f(n) \leq 1$ starting from some point (this is obvious from the definition of negligible function if you take the polynomial there to be the constant polynomial $1$). So if $c \geq 1$ then the validity of the claim is rather obvious, since $f(n)^c \leq f(n)$ if $f(n) \leq 1$. It remains to prove that $f^c$ is negligible if $c < 1$. Let $p$ be a positive polynomial. We have to show that there exists $n_0$ such that $f(n)^c \leq 1/p(n)$ for all $n \geq n_0$. Let $d$ be a natural number such that $cd \geq 1$. Then $f^{cd}$ is negligible, as we just proved. Since $f^{cd}$ is negligible and $p(n)^d$ is a positive polynomial, there exists $m_0$ such that $f(n)^{cd} \leq 1/p(n)^d$ for all $n \geq m_0$. Taking the $d$-th root of the inequality reveals that we can let $n_0 = m_0$. Q.E.D.++
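Here is a small numeric illustration of these properties (a sketch only, using $f(n) = 2^{-n}$ and the test monomial $n^{10}$; by the monomial criterion above, each product should tend to $0$):
<code python>
# Closure properties of negligible functions, illustrated for f(n) = 2**-n.
# Each combination, multiplied by n**10, should still tend to 0 as n grows.
def f(n):
    return 2.0 ** -n

for n in (100, 200, 400):
    m = n ** 10  # test monomial
    print(f"n={n:3}:  f+f: {m * (f(n) + f(n)):.1e}   "
          f"5f: {m * 5 * f(n):.1e}   "
          f"n^2*f: {m * n**2 * f(n):.1e}   "
          f"sqrt(f): {m * f(n) ** 0.5:.1e}")
</code>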
An adversary tries to attack a protocol. He can do two things: First, he can try //once// to guess the master key $k$. Second, he can try as often as he wishes to guess a key $k'$. You may assume that $2^{-|k|}$ and $2^{-|k'|}$ are negligible (i.e., a single try at guessing $k$ or $k'$ succeeds only with negligible
probability). The adversary wins if he guesses $k$ (in the first guess) or $k'$ (in any of the guesses).
Show that the probability that the adversary wins is negligible.
(Use [[#lemma_negl_prop|Lemma (Some properties of negligible functions)]]; remember also that $\Pr[A_1\vee\dots \vee A_n]\leq \Pr[A_1]+\dots+\Pr[A_n]$.)
++++Solution. | Let $p = 2^{-|k|}$ be the probability that the adversary guesses the key $k$ on a single try, and $p' = 2^{-|k'|}$ be the probability that the adversary guesses the key $k'$ on a single try. By saying that the probabilities $p$ and $p'$ are negligible, we mean that the key lengths $|k|$ and $|k'|$ are determined by some other variable $x$ (e. g. a parameter of the protocol) and the functions $p(x) = 2^{-|k(x)|}$ and $p'(x) = 2^{-|k'(x)|}$ are negligible (i. e. by increasing $x$ we can make the probabilities $p$ and $p'$ decrease quicker than the reciprocal of any polynomial of $x$).
Although the protocol allows the adversary to try to guess the key $k'$ as many times as he wishes, let us make the realistic assumption that each attempt takes a finite amount of time and the attacker does not live forever, so there is a limit to the number of times he can afford to guess the key. Let $c$ be the maximal number of times the adversary can guess the key $k'$; we suppose that $c$ is bounded by a constant. This is a realistic assumption, since making the key longer usually makes computations more time-consuming, so the number of guesses the attacker can afford is no more than the number of guesses he could afford at the shortest possible key length. The probability that the adversary wins after trying to guess the master key once and $c$ times trying to guess the key $k'$ is no more than $p + c \cdot p'$. Since $p$ and $p'$ are negligible with respect to the security parameter $x$ and $c$ is bounded, we can conclude from [[#lemma_negl_prop|Lemma (Some properties of negligible functions)]] that $p + c \cdot p'$ is negligible in $x$, too. Q. E. D.
Note that here the problem statement was somewhat ambiguous (but hopefully it made you think a little). If we had not made the assumption that the number of times the attacker can guess is bounded by some constant, but had instead allowed it to grow to infinity as the parameter $x$ is increased (e. g. suppose that $x$ represents time, and although standards with longer key lengths are adopted over time, the attacker has an exponentially growing botnet at his disposal), then the winning probability would not necessarily have remained negligible at all. E. g. if $p' = 2^{-x}$ and $c = 2^x$, then the product $cp' = 1$ is not negligible any more, so the reasoning above is not valid (the success probability of guessing the key $k'$ is of course less than $cp' = 1$, but it can be shown to converge to $1 - \frac1{e} \approx 0.63$, so it is indeed not negligible). If you reached that conclusion, you were right, too.
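One can check this limit numerically with a quick sketch:
<code python>
import math

# Probability of at least one success in 2**x tries, each succeeding with probability 2**-x.
for x in (5, 10, 20):
    tries, p = 2 ** x, 2.0 ** -x
    print(f"x = {x:2}: 1 - (1-p)^tries = {1 - (1 - p) ** tries:.6f}")
print(f"limit 1 - 1/e   = {1 - 1 / math.e:.6f}")
</code>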
++++
===== - Overwhelming functions =====
The next important class of functions in cryptography is that of //overwhelming// functions. Intuitively, a probability is overwhelming if it is almost $1$. Formally:
**Definition.** A function $f$ is **overwhelming** if $1-f$ is negligible.
For example: If we wish to state that a certain message delivery
algorithm is secure against denial-of-service attacks, in a formal
definition we might require that the probability that the message is
delivered is overwhelming. (Which is the same as requiring that the
message is lost with negligible probability.)
In a good cryptosystem, which of the following is overwhelming?
- the probability that no successful attack takes place,
- the probability that a successful attack takes place.
++ Answer | 1. ++
===== - Noticeable functions =====
Let us recall the algorithms $A$ and $A^*$ we discussed when introducing the notion of negligibility. We analyzed how the failure probability of algorithm $A^*$ is related to its running time (which is proportional to the number of repetitions). But algorithm $A$ might also have some input parameter (e. g. the length of the prime number to be generated); let's call it $n$. We didn't consider how $n$ affects the running time of $A^*$ — which is fine if we are only interested in a particular value of $n$. But let's investigate now how the performance of $A^*$ scales when $n$ is increased (e. g., if we want to generate longer and longer RSA keys).
Let's denote the worst-case running time of algorithm $A$ with input parameter $n$ by $T_A(n)$. The failure probability of algorithm $A$ might depend on $n$ as well — let's denote it by $q(n)$.
The worst-case running time of algorithm $A^*$, i. e. the $k$-times repetition of algorithm $A$ with input parameter $n$, is then obviously $T_{A^*}(k,n) = kT_A(n)$, and the failure probability of $A^*$ is $P_{\text{fail}}^{A^*}(k,n) = q(n)^k$. To choose $k$, let's fix a threshold $\epsilon > 0$ (the same for all $n$) such that we are satisfied if $A^*$ has failure probability at most $\epsilon$. When running $A^*$ with input parameter $n$, we of course choose $k$ to be the smallest number such that $q(n)^k < \epsilon$. Denote this number by $k_0(n, \epsilon)$. So the running time of $A^*$ is $T_{A^*}(n) = k_0(n, \epsilon)T_A(n)$.
As we saw before, thanks to the fact that $P_{\text{fail}}^{A^*}(k,n)$ is negligible in $k$, i. e. decreases very fast when $k$ increases, the running time $T_{A^*}(n)$ grows slowly if we decrease $\epsilon$. However, that analysis didn't cover how $T_{A^*}(n)$ behaves with respect to $n$; this behavior may not be as good. If the success probability of algorithm $A$, $p(n) = 1-q(n)$, decreases very quickly with $n$, then the number of repetitions must grow very quickly, so that algorithm $A^*$ may get very slow for large $n$, even if the time complexity of algorithm $A$ is small.
To be sure that the number of repetitions needed doesn't grow too fast with $n$, we need a //lower bound// on the success probability $p(n)$ that would guarantee that $p(n)$ does not decrease too fast. This is in contrast with the concept of negligibility, which gave an //upper bound// on the failure probability $P_{\text{fail}}^{A^*}(k)$, guaranteeing that it converges to $0$ sufficiently fast. So we need a new asymptotic notion: if the function $p(n)$ doesn't get near $0$ too fast, it is called //noticeable//. The precise definition is given below.
**Definition.** A positive function $f$ is **noticeable** if there exist a positive polynomial $p$ and a number $n_0$ such that $f(n)\geq 1/p(n)$ for all $n \geq n_0$.
**Remark.** //Using the definition of "noticeable" given above, our heuristic claim that algorithm $A^*$ gets very slow for large input parameters $n$ unless algorithm $A$ has a noticeable success probability can actually be made precise. Namely, it can be proven that if algorithm $A$ has polynomial time complexity, then algorithm $A^*$ also has polynomial time complexity if and only if the success probability of $A$, $p(n)$, is noticeable.//
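The remark above can be checked numerically in a small sketch (illustrative only; the success probabilities $p(n) = 1/n$ and $p(n) = 2^{-n}$ are chosen purely as examples of a noticeable and a non-noticeable function):
<code python>
import math

EPS = 2.0 ** -90  # fixed failure threshold for A*

def k0(p, eps=EPS):
    """Smallest k with (1 - p)**k <= eps, i.e. the number of repetitions A* needs."""
    return math.ceil(math.log(eps) / math.log(1.0 - p))

for n in (8, 16, 32):
    print(f"n={n:2}:  p(n)=1/n  -> k0 = {k0(1.0 / n):>12}   "
          f"p(n)=2^-n -> k0 = {k0(2.0 ** -n):>12}")
# With noticeable success probability 1/n, the number of repetitions grows
# roughly like 90*ln(2)*n (polynomially in n); with the non-noticeable 2^-n
# it grows roughly like 90*ln(2)*2^n (exponentially).
</code>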
Again, there are multiple alternative equivalent definitions:
Let $f$ be a positive function. The following are equivalent:
* $f$ is noticeable;
* $f(n) \in \Omega(1/p(n))$ for some positive polynomial $p$;
* $p(n)f(n) \to \infty$ for some positive polynomial $p$;
* $n^i f(n) \to \infty$ for some $i \in \mathbb N$.
Consider the function $f(n):=1$ for even $n$ and $f(n):=2^{-n}$ for odd $n$. Is $f$ negligible? overwhelming? noticeable?
++++Solution.|
The function $f$ is neither negligible nor overwhelming nor noticeable.\\
\\
Indeed, $f$ is not negligible since every negligible function converges to $0$ (this follows e. g. from the negligibility condition $p(n) f(n) \to 0$ if one takes $p$ to be the constant polynomial $p(x) = 1$), but $f$ does not. The function $f$ is not overwhelming since an overwhelming function converges to $1$ (indeed, if $f$ is overwhelming then $1-f$ is negligible, whence $\lim f(n) = \lim 1-(1-f(n)) = \lim 1 - \lim (1-f(n)) = 1 - 0 = 1$).\\
\\
Finally, to show that the function $f$ is not noticeable, let's suppose the contrary and derive a contradiction. If $f$ were noticeable, then $n^i f(n) \to \infty$ for some $i \in \mathbb N$. In particular, we would then have $\lim_{n \to \infty} n^i 2^{-n} = \infty$, where the limit is taken over all odd natural numbers $n$. But this is false, since it is a well-known fact that $\lim_{n \to \infty} n^i 2^{-n} = 0$ for any $i$ (the latter limit holds even if we do not constrain $n$ to be an odd natural number but take the limit over all real numbers).
++++
A warning: the word **non-negligible** is sometimes used as a
synonym for //noticeable//, and sometimes to mean a function that is not
negligible. Be aware of this when you see the word //non-negligible//!