Multiple variables

Asymptotic notations can be extended to the case where the functions depend on multiple variables. For example, if we have an algorithm that searches for a string of length $m$ in a text of length $n$, we might be interested in how the worst-case running time $T(m, n)$ behaves when both the search string and the text get very long (i.e., when $m$ and $n$ both grow to infinity).
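The text does not fix a particular search algorithm, so the naive matcher below is only an assumed illustration; it is one algorithm whose worst-case running time genuinely involves both variables, with $T(m, n) \in O(mn)$.

  # Naive string matching: an assumed example whose worst-case running
  # time depends on both variables, T(m, n) in O(mn).
  def naive_search(pattern, text):
      """Return the index of the first occurrence of pattern in text, or -1."""
      m, n = len(pattern), len(text)
      for i in range(n - m + 1):        # every candidate start position in text
          j = 0
          while j < m and text[i + j] == pattern[j]:
              j += 1                    # up to m character comparisons per position
          if j == m:                    # full match found
              return i
      return -1

  # Worst case: pattern "aa...ab" against text "aa...a" forces roughly
  # (n - m + 1) * m comparisons.
  print(naive_search("a" * 3 + "b", "a" * 10))  # -1, after about 28 comparisons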

The definitions of the asymptotic notations extend to the multivariate case in a natural way. For example, the definition of Big Oh for two variables going to infinity reads as follows:

Definition (Big Oh for functions of two variables). Let $f$ and $g$ be positive functions of two variables. We say that $f$ grows asymptotically no faster than $g$ and write $f\in O(g)$ if there exist constants $c$, $m_0$ and $n_0$ such that $f(m, n)\leq c\cdot g(m, n)$ whenever $m \geq m_0$ and $n \geq n_0$.
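Note that the definition quantifies over all sufficiently large $m$ and $n$ at once. A finite grid check, like the sketch below, can only falsify a proposed choice of $c$, $m_0$, $n_0$, never prove the asymptotic claim; the helper name and the grid bound here are illustrative choices, not part of the definition.

  # Sanity check of f(m, n) <= c * g(m, n) on a finite grid -- not a proof,
  # since Big Oh quantifies over *all* m >= m0 and n >= n0.
  def within_bound(f, g, c, m0, n0, limit=200):
      return all(f(m, n) <= c * g(m, n)
                 for m in range(m0, limit + 1)
                 for n in range(n0, limit + 1))

  # With c = 4 and m0 = n0 = 1, the bound 4mn <= 4 * (m * n^2) holds on the grid:
  print(within_bound(lambda m, n: 4*m*n, lambda m, n: m * n**2, c=4, m0=1, n0=1))  # True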

For example, if $T(m, n) = 4mn$, $g(m, n) = mn^2$ and $h(m, n)=n^3$, then $T \in O(g)$ but $T \not\in O(h)$.
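Both claims can be checked against the definition (the constants chosen here are one possible choice, not the only one). Taking $c = 4$ and $m_0 = n_0 = 1$ gives $T(m, n) = 4mn \leq 4\cdot mn^2 = c\cdot g(m, n)$ whenever $m \geq 1$ and $n \geq 1$, so $T \in O(g)$. Conversely, if $4mn \leq c\cdot n^3$ held for all $m \geq m_0$ and $n \geq n_0$, then fixing $n = n_0$ and letting $m \to \infty$ would make the left-hand side unbounded while the right-hand side stays constant; hence no constants $c$, $m_0$, $n_0$ exist and $T \not\in O(h)$.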

Whenever multiple variables are involved, one must be careful to make clear which variables are function arguments that go to infinity and which are constants: for instance, the statement $nk\in O(n)$ is ambiguous if we do not know whether $k\to \infty$, $n\to\infty$, or both. If $n \to \infty$ and $k$ is a constant, then the statement $nk\in O(n)$ is true; if $n$ and $k$ both go to infinity, or only $k$ goes to infinity, then it is false.
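To spell the two cases out (a short sketch following the single-variable definition of Big Oh): if $k$ is a constant, then $nk \leq c\cdot n$ for all $n \geq 1$ with the choice $c = k$, so $nk \in O(n)$. If instead $k \to \infty$, then for any fixed constant $c$ we have $nk > c\cdot n$ as soon as $k > c$, so no single constant witnesses the bound and $nk \not\in O(n)$.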
