CSC 152 2005F, Class 34: Algorithm Analysis (2)
Admin:
* Walker observes
* About CSC295
* Homework: Lab on Algorithm Analysis
Overview
* Review
* Big-O Notation
* Eliminating constants
* Using the notation: Relating functions
* Analyzing recursive methods
About the HW
* Counting by hand is annoying
* Prediction for "slow" version: If we double the size, we should multiply the time by 4
* Prediction didn't necessarily match. Why?
* Counting by hand is approximate
* Doesn't account for time to create vector
* We're modeling, and models don't always work
* Our prediction could have been wrong
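One way to dodge several of the problems above is to count steps in code rather than measure wall-clock time, so timer noise and setup costs drop out. A quick sketch (in Python rather than the course's Scheme, and with made-up names; not the actual lab code), counting the inner-loop steps of a quadratic scan:

```python
def count_steps(n):
    """Count inner-loop executions of a quadratic, selection-sort-like scan."""
    steps = 0
    for i in range(n):
        for j in range(i + 1, n):
            steps += 1   # one comparison
    return steps

# Doubling the input size should roughly quadruple the count
# for a quadratic algorithm.
small = count_steps(100)   # 100*99/2 = 4950
big = count_steps(200)     # 200*199/2 = 19900
print(big / small)         # close to 4
```

Unlike a timed run, this count is exact and repeatable, which is why the doubling prediction holds here even though it may not in a timing experiment.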
Review:
* We're analyzing the running speed of algorithms
* We can approximate the running speed of algorithms by "counting"
steps statically
* We can also try experiments
* Constants give us headaches
A formal approach
* Can we come up with a mathematical notation that helps us model the running time?
* And helps us ignore the stuff we don't care about
* We care about *an upper bound* on *the shape of the curve*
when *input is large enough*
* Plus get rid of those annoying constants
* Notation: Big-O
O(f(n)) is the "set" of all functions bounded above by f(n) (ignoring constant multipliers)
g(n) is in O(f(n)) iff there exist constants d > 0 and n0 such that
for all n > n0,
|g(n)| <= |d*f(n)|
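We can't check "for all n" by computation, but we can spot-check that a claimed witness pair (d, n0) holds on a finite range. A small sketch (Python, my own function name, not part of the notes):

```python
def holds_on_range(g, f, d, n0, upto=10_000):
    """Spot-check |g(n)| <= |d * f(n)| for all n0 < n <= upto."""
    return all(abs(g(n)) <= abs(d * f(n)) for n in range(n0 + 1, upto + 1))

# g(n) = 2n^2 is in O(n^2): the witnesses d = 3, n0 = 1 work.
print(holds_on_range(lambda n: 2 * n * n, lambda n: n * n, d=3, n0=1))  # True

# But n^2 is not in O(n): even a big d like 100 fails once n passes 100.
print(holds_on_range(lambda n: n * n, lambda n: n, d=100, n0=1))        # False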
Prove that g(n) = 2n^2 is in O(n^2)
Let n0 be 1
Let d be 3
Compare 2n^2 vs. 3*n^2: 2n^2 < 3n^2 for all n > 1
Consider g(n) = 100n vs f(n) = n^2
For n = 5, g(n) = 500, f(n) = 25
For n = 10, g(n) = 1000, f(n) = 100
Let n0 = 100
Let d = 1
g(n) = 100*n
f(n) = n*n > n*100, so f(n) > g(n)
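The crossover above is easy to confirm numerically. A tiny Python check (a sketch, not from the notes) of both halves of the argument: g starts out far above f, but the witnesses n0 = 100, d = 1 work from then on:

```python
g = lambda n: 100 * n
f = lambda n: n * n

print(g(5), f(5))    # 500 25  -- g is far above f for small n...
print(g(10), f(10))  # 1000 100
# ...but once n > 100, n*n = n*(more than 100) > 100*n, so d = 1 suffices.
print(all(g(n) <= 1 * f(n) for n in range(101, 100_001)))  # True
```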
Consider f(n) = 100n vs g(n) = n^2
Is g(n) in O(f(n))?
Look at the shape: n^2 eventually exceeds d*100n for any constant d, so n^2 is not in O(100n)
If f(n) is in O(h(n)) and g(n) is in O(h(n))
What do we know about f(n)+g(n)?
Bounded above by a constant multiple of h(n); so f(n)+g(n) is in O(h(n))
If f(n) is in O(g(n)) and g(n) is in O(h(n))
f(n) is in O(h(n))
Also, if f(n) is in O(g(n)), then f(n) + g(n) is in O(g(n))
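The sum rule has a concrete witness construction worth making explicit: if (d1, n1) witnesses f against h and (d2, n2) witnesses g, then (d1 + d2, max(n1, n2)) witnesses f + g. A spot-check of one instance (a sketch; the functions and constants are mine):

```python
f = lambda n: 5 * n          # in O(n^2) with d1 = 5, n1 = 1
g = lambda n: 2 * n * n      # in O(n^2) with d2 = 2, n2 = 1
h = lambda n: n * n

d, n0 = 5 + 2, max(1, 1)     # combined witnesses for f + g
print(all(f(n) + g(n) <= d * h(n) for n in range(n0 + 1, 10_001)))  # True
```

Algebraically: 5n + 2n^2 <= 7n^2 reduces to 5n <= 5n^2, which holds for all n >= 1.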
We get to throw away lower-order terms when using Big-O analysis
running time is c*n^2 + d*n + e
* Big-O basics say "throw away constants"
n^2 + n + 1
"The running time is in O(n^2 + n + 1)"
n^2 + n + 1 is in O(n^2) because n is in O(n^2) and 1 is in O(n^2)
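Concretely, the witnesses d = 3, n0 = 1 do the job for n^2 + n + 1, since n <= n^2 and 1 <= n^2 once n >= 1 and we can bound each of the three terms by n^2. A one-line numeric check (a sketch, not from the notes):

```python
# n^2 + n + 1 <= 3*n^2 for all n >= 1, so n^2 + n + 1 is in O(n^2)
# with witnesses d = 3, n0 = 1.
print(all(n * n + n + 1 <= 3 * n * n for n in range(1, 10_001)))  # True
```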
What about analyzing recursive functions?
(define sum
  (lambda (x)
    (if (null? x)      ; 1
        0              ; 1
        (+             ; 1
         (car x)       ; 1
         (sum          ; The running time of sum. Whoops.
          (cdr x)))))) ; 1
* Hand waving. We look at each element once. We do a constant amount of work for each element. Hence, it's in O(n)
* Code rewrite. Rephrase the recursion as iteration and use normal counting
* Recurrence analysis
t(n) = "the amount of time sum takes on an input of size n"
t(0) = c
t(n) = d + t(n-1)
// Technique: Repeatedly expand right-hand-side and look for
// patterns
t(n) = d + d + t(n-2) = 2d + t(n-2)
t(n) = 2d + d + t(n-3) = 3d + t(n-3)
t(n) = 3d + d + t(n-4) = 4d + t(n-4)
t(n) = kd + t(n-k)
When n-k = 0, k = n
t(n) = n*d + t(0) = n*d + c, which is in O(n)
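The closed form is easy to check against the recurrence itself. A small sketch (Python; c and d are arbitrary positive constants, the values below are just for the check):

```python
c, d = 7, 3   # hypothetical constants: t(0) = c, each recursive step costs d

def t(n):
    """Evaluate the recurrence t(0) = c; t(n) = d + t(n-1)."""
    total = c
    for _ in range(n):   # iterative, to avoid recursion-depth limits
        total += d
    return total

# The closed form n*d + c matches the recurrence for every n we try.
print(all(t(n) == n * d + c for n in range(1000)))  # True
```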