Asymptotic Notation Examples
CSE-250 Fall 2022 - Section B
Sept 14, 2022
Textbook: Ch. 7.3-7.4
Big-ϴ/O/Ω Notation Recap
- Big-ϴ
  - Growth functions in the same complexity class.
  - If $f(n) \in \Theta(g(n))$, then an algorithm that takes $f(n)$ steps is, asymptotically, exactly as fast as one that takes $g(n)$ steps.
- Big-O
  - Growth functions in the same or a smaller complexity class.
  - If $f(n) \in O(g(n))$, then an algorithm that takes $f(n)$ steps is as fast as or faster than one that takes $g(n)$ steps.
- Big-Ω
  - Growth functions in the same or a bigger complexity class.
  - If $f(n) \in \Omega(g(n))$, then an algorithm that takes $f(n)$ steps is as slow as or slower than one that takes $g(n)$ steps.
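For instance (a worked example, not from the slides): $3n^2 + 5n \in \Theta(n^2)$, witnessed by $c_{low} = 3$, $c_{high} = 4$, and $n_0 = 5$, since for all $n \geq 5$:

$$3n^2 \;\leq\; 3n^2 + 5n \;\leq\; 4n^2$$

(The upper bound $3n^2 + 5n \leq 4n^2$ holds exactly when $5n \leq n^2$, i.e., $n \geq 5$.)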
Common Runtimes
- Constant Time: $\Theta(1)$
  - e.g., $T(n) = c$ (runtime is independent of $n$)
- Logarithmic Time: $\Theta(\log(n))$
  - e.g., $T(n) = c\log(n)$ (for some constant $c$)
- Linear Time: $\Theta(n)$
  - e.g., $T(n) = c_1n + c_0$ (for some constants $c_0, c_1$)
- Quadratic Time: $\Theta(n^2)$
  - e.g., $T(n) = c_2n^2 + c_1n + c_0$
- Polynomial Time: $\Theta(n^k)$ (for some $k \in \mathbb Z^+$)
  - e.g., $T(n) = c_kn^k + \ldots + c_1n + c_0$
- Exponential Time: $\Theta(c^n)$ (for some $c > 1$)
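As a sketch (my examples, not from the slides), here are Scala loop shapes that realize four of these classes. Each function returns its step count, so the growth can be inspected directly:

```scala
def constantSteps(n: Int): Int = 1                // Θ(1): independent of n

def logSteps(n: Int): Int = {                     // Θ(log n): halve n each pass
  var k = n
  var steps = 0
  while(k > 1) { k /= 2; steps += 1 }
  steps
}

def linearSteps(n: Int): Int = {                  // Θ(n): one pass over the input
  var steps = 0
  for(i <- 0 until n) { steps += 1 }
  steps
}

def quadraticSteps(n: Int): Int = {               // Θ(n^2): nested passes
  var steps = 0
  for(i <- 0 until n; j <- 0 until n) { steps += 1 }
  steps
}
```

Doubling $n$ leaves `constantSteps` unchanged, adds one to `logSteps`, doubles `linearSteps`, and quadruples `quadraticSteps`.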
Constant-Factor Speedups
for(i ← 0 until n) { /* do work */ }
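For illustration (my example, not from the slides): tripling the work inside the loop triples the constant $c$, but both functions below remain $\Theta(n)$.

```scala
def oneOpPerIteration(n: Int): Int = {
  var ops = 0
  for(i <- 0 until n) { ops += 1 }   // T(n) = n
  ops
}

def threeOpsPerIteration(n: Int): Int = {
  var ops = 0
  for(i <- 0 until n) { ops += 3 }   // T(n) = 3n, still Θ(n)
  ops
}
```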
$c$ and $n_0$
Compare $T_1(n) = 100n$ vs $T_2(n) = n^2$
- $100n = O(n^2)$ ($T_2$ is the slower runtime)
- ... but $c_{high} = 1$, $n_0 = 100$
- For inputs smaller than 100, $T_2$ is actually the faster runtime
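The crossover can be checked directly (a small sketch; the function names are mine):

```scala
// The two hypothetical runtimes from the comparison above.
def t1(n: Long): Long = 100 * n   // T1(n) = 100n
def t2(n: Long): Long = n * n     // T2(n) = n^2

// For n < 100 the asymptotically worse T2 is smaller; from n = 100
// onward, T1 is smaller or equal (matching c_high = 1, n_0 = 100).
val crossover = (1L to 1000L).find(n => t1(n) <= t2(n))
```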
$c$ and $n_0$
Asymptotically slower runtimes can be better.
- An algorithm with runtime $T_2$ is better on small inputs.
- An algorithm with runtime $T_2$ might be easier to implement or maintain.
- An algorithm with runtime $T_1$ might not exist.
- (sometimes we can prove this, see CSE 331)
... but from now on, if $T_2(n)$ is in a bigger complexity class, then $T_1(n)$ is better/faster/stronger.
Bubble Sort
bubblesort(seq: Seq[Int]):
1. n ← seq length
2. for i ← n-2 to 0, by -1:
3.   for j ← i to n-2:
4.     if seq(j+1) < seq(j):
5.       swap seq(j) and seq(j+1)
What is the runtime complexity class of Bubble Sort?
Summation Rules
- $\sum_{i=j}^{k}c = (k - j + 1)c$
- $\sum_{i=j}^{k}(cf(i)) = c\sum_{i=j}^{k}f(i)$
- $\sum_{i=j}^{k}(f(i) + g(i)) = \left(\sum_{i=j}^{k}f(i)\right) + \left(\sum_{i=j}^{k}g(i)\right)$
- $\sum_{i=j}^{k}(f(i)) = \left(\sum_{i=\ell}^{k}(f(i))\right) - \left(\sum_{i=\ell}^{j-1}(f(i))\right)$ (for any $\ell < j$)
- $\sum_{i=j}^{k}f(i) = f(j) + f(j+1) + \ldots + f(k-1) + f(k)$
- $\sum_{i=j}^{k}f(i) = f(j) + \ldots + f(\ell - 1) + \left(\sum_{i=\ell}^k f(i)\right)$ (for any $j < \ell \leq k$)
- $\sum_{i=j}^{k}f(i) = \left(\sum_{i=j}^{\ell}f(i)\right) + f(\ell+1) + \ldots + f(k)$ (for any $j \leq \ell < k$)
- $\sum_{i=1}^{k}i = \frac{k(k+1)}{2}$
- $\sum_{i=0}^{k}2^i = 2^{k+1}-1$
- $n! \leq c_sn^n$ is a tight upper bound (Stirling: some constant $c_s$ exists)
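The two closed forms near the end of the list can be spot-checked numerically (a sanity check, not a proof; the function names are mine):

```scala
// Sum 1 + 2 + ... + k; closed form: k(k+1)/2
def gaussSum(k: Int): Int = (1 to k).sum

// Sum 2^0 + 2^1 + ... + 2^k; closed form: 2^(k+1) - 1
def geometricSum(k: Int): Int = (0 to k).map(i => 1 << i).sum
```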
Bubble Sort
bubblesort(seq: Seq[Int]):
1. n ← seq length
2. for i ← n-2 to 0, by -1:
3.   for j ← i to n-2:
4.     if seq(j+1) < seq(j):
5.       swap seq(j) and seq(j+1)
Note: We can ignore the exact number of operations each step of the algorithm requires, as long as we know its complexity class.
Can we safely say this algorithm is $\Theta(n^2)$?
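One way to answer, using the summation rules above (assuming each inner-loop iteration costs some constant $c$, so the inner loop contributes $n-1-i$ iterations for outer index $i$):

$$T(n) = \sum_{i=0}^{n-2} \sum_{j=i}^{n-2} c = \sum_{i=0}^{n-2} c\,(n-1-i) = c \sum_{m=1}^{n-1} m = c \cdot \frac{n(n-1)}{2} \in \Theta(n^2)$$

The loops always run to completion regardless of the input, so this bound holds in every case, not just the worst case.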
Bubble Sort (for Mutable Sequences)
import scala.collection.mutable

def sort(seq: mutable.Seq[Int]): Unit =
{
  val n = seq.length
  for(i <- n - 2 to 0 by -1; j <- i to n - 2)
  {
    if(seq(j + 1) < seq(j))
    {
      val temp = seq(j + 1)
      seq(j + 1) = seq(j)
      seq(j) = temp
    }
  }
}
Bubble Sort (for Immutable Sequences)
def sort(seq: Seq[Int]): Seq[Int] =
{
  val newSeq = seq.toArray
  val n = seq.length
  for(i <- n - 2 to 0 by -1; j <- i to n - 2)
  {
    if(newSeq(j + 1) < newSeq(j))
    {
      val temp = newSeq(j + 1)
      newSeq(j + 1) = newSeq(j)
      newSeq(j) = temp
    }
  }
  return newSeq.toList
}
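For experimenting, here is a self-contained variant of the same algorithm (my packaging, not from the slides; it sorts a copy of the input and returns it):

```scala
def bubbleSort(input: Seq[Int]): Seq[Int] = {
  val a = input.toArray                           // work on a mutable copy
  val n = a.length
  for(i <- n - 2 to 0 by -1; j <- i to n - 2) {
    if(a(j + 1) < a(j)) {                         // out of order: swap
      val temp = a(j + 1)
      a(j + 1) = a(j)
      a(j) = temp
    }
  }
  a.toList
}
```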
Searching Sequences
def indexOf[T](seq: Seq[T], value: T, from: Int): Int =
{
  for(i <- from until seq.length)
  {
    if(seq(i).equals(value)) { return i }
  }
  return -1
}
Searching Sequences
def count[T](seq: Seq[T], value: T): Int =
{
  var count = 0;
  var i = indexOf(seq, value, 0)
  while(i != -1)
  {
    count += 1;
    i = indexOf(seq, value, i+1)
  }
  return count
}
Searching Sorted Sequences
... with $O(1)$ access to elements ('random access')
- To search $[begin, end)$:
  - compare $target$ to $seq[middle]$ (where $middle = \lfloor\frac{begin+end}{2}\rfloor$)
  - If $seq[middle] = target$, return $middle$
  - If $target < seq[middle]$, search $[begin, middle)$
  - If $seq[middle] < target$, search $[middle+1, end)$
  - If $begin = end$, the value doesn't exist.
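The steps above can be sketched in Scala (a recursive sketch under the stated assumption of a sorted `IndexedSeq` with $O(1)$ indexing; not from the slides):

```scala
import scala.annotation.tailrec

// Binary search over a sorted IndexedSeq, following the steps above.
// Returns Some(index) of one occurrence of target, or None if absent.
@tailrec
def binarySearch(seq: IndexedSeq[Int], target: Int,
                 begin: Int, end: Int): Option[Int] = {
  if(begin >= end) { None }                       // empty range: not present
  else {
    val middle = begin + (end - begin) / 2        // avoids Int overflow vs (begin+end)/2
    if(seq(middle) == target)     { Some(middle) }
    else if(target < seq(middle)) { binarySearch(seq, target, begin, middle) }
    else                          { binarySearch(seq, target, middle + 1, end) }
  }
}
```

Each recursive call halves the range, giving $O(\log n)$ comparisons.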
What if no random access is available?