Oct 26, 2022
$$\left(\sum_{i=1}^n O(\log(i))\right) + \left(\sum_{i=1}^n O(\log(n-i))\right)$$
$$= O(n\log(n))$$
After updating current, call fixUp or fixDown.
If the new current is greater than its parent, fixUp moves it up.
If the new current is smaller than one of its children, fixDown moves it down.
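Both moves can be sketched directly on a max-heap stored in an array (node $i$ has parent $(i-1)/2$ and children $2i+1$, $2i+2$). A minimal sketch under that assumption; the helper name `swap` and the use of a mutable buffer are mine:

```scala
import scala.collection.mutable.ArrayBuffer

// Swap two slots of the array.
def swap(a: ArrayBuffer[Int], i: Int, j: Int): Unit = {
  val t = a(i); a(i) = a(j); a(j) = t
}

// Bubble index idx up while it is bigger than its parent (max-heap).
def fixUp(a: ArrayBuffer[Int], idx: Int): Unit = {
  var i = idx
  while (i > 0 && a(i) > a((i - 1) / 2)) {
    swap(a, i, (i - 1) / 2)
    i = (i - 1) / 2
  }
}

// Push index idx down while it is smaller than one of its children.
def fixDown(a: ArrayBuffer[Int], idx: Int): Unit = {
  var i = idx
  var done = false
  while (!done) {
    val l = 2 * i + 1
    val r = 2 * i + 2
    var largest = i
    if (l < a.length && a(l) > a(largest)) largest = l
    if (r < a.length && a(r) > a(largest)) largest = r
    if (largest == i) done = true
    else { swap(a, i, largest); i = largest }
  }
}
```

fixUp is what insert uses after appending at the end of the array; fixDown is what dequeue uses after moving the last element to the root.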
How do we know where the value appears in the heap?
Input: Array
Output: Array reordered to be a heap
$$O\left(\sum_{i = 1}^{\log(n)} \frac{n}{2^{i}} \cdot (i+1) \right)$$
$$O\left(n \sum_{i = 1}^{\log(n)} \frac{i}{2^{i}} + \frac{1}{2^i}\right)$$
$$O\left(n \sum_{i = 1}^{\log(n)} \frac{i}{2^{i}}\right)$$
(the dropped $\frac{1}{2^i}$ terms sum to less than 1, so they contribute only $O(n)$)
$$O\left(n \sum_{i = 1}^{\infty} \frac{i}{2^{i}}\right)$$
$\sum_{i = 1}^{\infty} \frac{i}{2^{i}}$ is known to converge to a constant (specifically, to 2).
$$O\left(n\right)$$
We can go from an unsorted array to a heap in $O(n)$
(but heap sort still requires $n \log(n)$ for dequeueing)
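The $O(n)$ bound comes from running fixDown on every internal node, deepest level first. A minimal sketch, assuming a max-heap stored in an array (node $i$ has children $2i+1$ and $2i+2$); the function names are mine:

```scala
import scala.collection.mutable.ArrayBuffer

// Push index start down while it is smaller than one of its children.
def fixDown(a: ArrayBuffer[Int], start: Int): Unit = {
  var i = start
  var done = false
  while (!done) {
    val l = 2 * i + 1
    val r = 2 * i + 2
    var largest = i
    if (l < a.length && a(l) > a(largest)) largest = l
    if (r < a.length && a(r) > a(largest)) largest = r
    if (largest == i) done = true
    else {
      val t = a(i); a(i) = a(largest); a(largest) = t
      i = largest
    }
  }
}

// Bottom-up heapify: fixDown each internal node, deepest first.
// The summation above shows the total work is O(n).
def buildHeap(a: ArrayBuffer[Int]): Unit =
  for (i <- (a.length / 2 - 1) to 0 by -1) fixDown(a, i)
```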
Set: An unordered collection of unique elements.
Bag: An unordered collection of non-unique elements.
Property | Seq | Set | Bag |
---|---|---|---|
Explicit Order | ✓ | | |
Enforced Uniqueness | | ✓ | |
Iterable | ✓ | ✓ | ✓ |
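Scala's standard collections illustrate the three rows directly (a quick sketch; representing the bag as an element-to-count map is one common encoding, not the only one):

```scala
val seq = Seq(3, 1, 3)         // Seq: explicit order, duplicates allowed
val set = Set(3, 1, 3)         // Set: uniqueness enforced, so only 3 and 1 remain
val bag = Map(3 -> 2, 1 -> 1)  // Bag: no order, duplicates tracked by count
```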
class TreeNode[T](
var _value: T,
var _left: Option[TreeNode[T]],
var _right: Option[TreeNode[T]]
)
class Tree[T] {
var root: Option[TreeNode[T]] = None // empty tree
}
trait Tree[+T]
case class TreeNode[T](
value: T,
left: Tree[T],
right: Tree[T]
) extends Tree[T]
case object EmptyTree extends Tree[Nothing]
def printTree[T](root: Tree[T], indent: Int): Unit =
{
  root match {
    case TreeNode(v, left, right) =>
      println((" " * indent) + v)
      printTree(left, indent + 2)
      printTree(right, indent + 2)
    case EmptyTree =>
      /* Do Nothing */
  }
}
The height of a tree is the height of the root.
def height[T](root: Tree[T]): Int =
{
root match {
case EmptyTree =>
0
case TreeNode(v, left, right) =>
1 + Math.max( height(left), height(right) )
}
}
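Putting the pieces together, a quick usage sketch (the case-class definitions from above are repeated so it runs standalone):

```scala
sealed trait Tree[+T]
case class TreeNode[T](value: T, left: Tree[T], right: Tree[T]) extends Tree[T]
case object EmptyTree extends Tree[Nothing]

def height[T](root: Tree[T]): Int = root match {
  case EmptyTree                => 0
  case TreeNode(_, left, right) => 1 + Math.max(height(left), height(right))
}

val leaf = TreeNode(1, EmptyTree, EmptyTree)
val t = TreeNode(2, leaf, TreeNode(3, EmptyTree, EmptyTree))
assert(height(t) == 2)   // root contributes 1, each child is a leaf of height 1
```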
A Binary Tree where each node stores a unique key, and keys are ordered: every key in a node's left subtree is less than the node's key, and every key in its right subtree is greater.
The root's key $X_1$ partitions the keys of its subtrees: everything smaller goes left, everything larger goes right.
Goal: Find an item with key $k$ in a BST rooted at root
def find[V: Ordering](root: BST[V], target: V): Option[V] =
  root match {
    case TreeNode(v, left, right) =>
      if(Ordering[V].lt( target, v )){ find(left, target) }
      else if(Ordering[V].lt( v, target )){ find(right, target) }
      else { Some(v) }
    case EmptyTree =>
      None
  }
What's the complexity? (how many times do we call 'find'?) $O(d)$
Goal: Insert an item with key $k$ in a BST rooted at root
def insert[V: Ordering](root: BST[V], value: V): BST[V] =
node match {
case TreeNode(v, left, right) =>
if(Ordering[V].lt( target, v ) ){
return TreeNode(v, insert(left, target), right)
} else if(Ordering[V].lt( v, target ) ){
return TreeNode(v, left, insert(right, target))
} else {
return node // already present
}
case EmptyTree =>
return TreeNode(value, EmptyTree, EmptyTree)
}
What's the complexity? $O(d)$
Goal: Remove the item with key $k$ from a BST rooted at root
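The notes give no code for remove; here is one possible sketch over the immutable tree representation from above (definitions repeated so it stands alone), using the smallest key of the right subtree to replace a node with two children:

```scala
sealed trait Tree[+T]
case class TreeNode[T](value: T, left: Tree[T], right: Tree[T]) extends Tree[T]
case object EmptyTree extends Tree[Nothing]

// Smallest value in a nonempty subtree: follow left children.
def minValue[V](node: TreeNode[V]): V = node.left match {
  case l @ TreeNode(_, _, _) => minValue(l)
  case EmptyTree             => node.value
}

def remove[V: Ordering](root: Tree[V], target: V): Tree[V] = root match {
  case EmptyTree => EmptyTree  // key not present: nothing to do
  case TreeNode(v, left, right) =>
    if (Ordering[V].lt(target, v))      TreeNode(v, remove(left, target), right)
    else if (Ordering[V].lt(v, target)) TreeNode(v, left, remove(right, target))
    else (left, right) match {
      case (EmptyTree, _) => right                // zero or one child:
      case (_, EmptyTree) => left                 //   promote it
      case (_, r @ TreeNode(_, _, _)) =>          // two children: replace with
        val succ = minValue(r)                    //   smallest key on the right
        TreeNode(succ, left, remove(right, succ))
    }
}
```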
What's the complexity? $O(d)$
Operation | Runtime |
---|---|
find | $O(d)$ |
insert | $O(d)$ |
remove | $O(d)$ |
What's that in terms of $n$? $O(n)$: in the worst case (e.g., keys inserted in sorted order) the tree degenerates into a path, so $d = n$.
Does it need to be that bad?
How do we implement bags?
Idea 1: Just allow multiple copies (relax the order invariant to $X_L \leq X_1$)
Idea 2: One copy, but store a count
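Idea 2 can be sketched with a count per element; a hash map stands in for the tree here for brevity, but the same element-to-count idea works with the key/value tree below:

```scala
// A bag that keeps one copy of each element plus a count (sketch).
class Bag[T] {
  private var counts = Map.empty[T, Int]

  def add(x: T): Unit  = counts = counts.updated(x, counts.getOrElse(x, 0) + 1)
  def count(x: T): Int = counts.getOrElse(x, 0)

  def remove(x: T): Unit = count(x) match {
    case 0 => ()                                   // not present
    case 1 => counts = counts - x                  // last copy: drop the entry
    case n => counts = counts.updated(x, n - 1)    // just decrement
  }
}
```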
trait Tree[+K, +V]
case class TreeNode[K, V](
key: K,
value: V,
left: Tree[K, V],
right: Tree[K, V]
) extends Tree[K, V]
case object EmptyTree extends Tree[Nothing, Nothing]
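With keys separated from values, find navigates by key and returns the associated value. A sketch (definitions repeated so it runs standalone):

```scala
sealed trait Tree[+K, +V]
case class TreeNode[K, V](
  key: K,
  value: V,
  left: Tree[K, V],
  right: Tree[K, V]
) extends Tree[K, V]
case object EmptyTree extends Tree[Nothing, Nothing]

def find[K: Ordering, V](root: Tree[K, V], target: K): Option[V] = root match {
  case EmptyTree => None
  case TreeNode(k, v, left, right) =>
    if (Ordering[K].lt(target, k))      find(left, target)   // target is left of k
    else if (Ordering[K].lt(k, target)) find(right, target)  // target is right of k
    else                                Some(v)              // key found
}
```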
Balancing Trees