Approximate overview
Why does Microsoft Bookings send me scheduling appointments in PDT?
I have no idea.
Pay attention to the time zone the appointment lists.
How do we meet with you on Teams?
Wait until your appointment.
I will DM (Chat) you a message like, “Are you ready to meet?”
Respond.
I will then click the magic camera button to connect.
Note: Sometimes meetings run over. If I am not available, just DM me and say “Aren’t we meeting?” or something like that.
What if we don’t have an appointment?
DM me to see if I’m available or to ask a question.
What’s a reasonable time to wait before asking again?
For normal people, two days.
For me, twelve hours.
Or note-ation?
What do we call it?
If f(n) is in O(g(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n)).
If f(n) is in O(n^2) and n^2 is in O(n^3), then f(n) is in O(n^3).
If f(n) is in O(10n + n^2), then f(n) is in O(n^2).
If f(n) is in O(g(n) + h(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n)).
Work as a group on a proof. Look back at how we prove such things.
Reminder: f(n) is in O(g(n)) iff there exist c>0 and n0>0 s.t. f(n)<=c*g(n) for all n > n0.
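We can spot-check the definition numerically. For example, for the claim that f(n) = 10n + n^2 is in O(n^2), the witnesses c = 11 and n0 = 1 work. Here is a hypothetical Java sketch (class and method names mine, not part of the notes) that checks the inequality over a range of n:

```java
// Hypothetical sketch: numerically spot-check big-Oh witnesses.
// Claim: f(n) = 10n + n^2 is in O(n^2), with witnesses c = 11, n0 = 1.
public class BigOhWitness {
  // Returns true if f(n) <= c*g(n) for all n with n0 < n <= limit.
  static boolean holds(long c, long n0, long limit) {
    for (long n = n0 + 1; n <= limit; n++) {
      long f = 10 * n + n * n;   // f(n) = 10n + n^2
      long g = n * n;            // g(n) = n^2
      if (f > c * g) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(holds(11, 1, 100000)); // prints true
    System.out.println(holds(1, 1, 100));     // prints false: c = 1 is too small
  }
}
```

A finite check is not a proof, of course, but it can catch bad witnesses quickly.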
Normally:
Note: We want d and m s.t. f(n) <= d*h(n) for all n > m.
There exist c0 and n0 s.t., f(n) <= c0*(g(n) + h(n)) for all n > n0.
By definition of Big-Oh and first condition.
There exist c1 and n1 s.t., g(n) <= c1*h(n) for all n > n1.
By definition of Big-Oh and second condition.
If f(n) <= c0*(g(n) + h(n)) and g(n) <= c1*h(n) then
f(n) <= c0*(c1*h(n) + h(n)) for all n > max(n1, n0)
By transitivity of <= (and some other algebra).
f(n) <= (c0*c1 + c0)*h(n) for all n > max(n1, n0)
By the distributive property of multiplication over addition.
Let d=c0*c1 + c0. Let m=max(n1,n0)
f(n) <= d*h(n) for all n > m.
Substitution
f(n) is in O(h(n))
by the definition of big-Oh
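To make the proof concrete, here is a hypothetical numeric instance (my own choice, not from the notes): take g(n) = n and h(n) = n^2 with c0 = c1 = 1 and n0 = n1 = 1. The proof predicts d = c0*c1 + c0 = 2 and m = max(n1, n0) = 1, and we can check that bound directly:

```java
// Hypothetical instance of the lemma: g(n) = n, h(n) = n^2,
// c0 = c1 = 1, n0 = n1 = 1. Take the largest f allowed, f(n) = g(n) + h(n).
// The proof predicts f(n) <= d*h(n) with d = 2, for all n > m = 1.
public class SumLemma {
  static boolean predictedBoundHolds(long limit) {
    for (long n = 2; n <= limit; n++) {  // all n > m = 1 up to limit
      long f = n + n * n;                // f(n) = c0*(g(n) + h(n))
      long h = n * n;
      if (f > 2 * h) {                   // d = c0*c1 + c0 = 2
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(predictedBoundHolds(100000)); // prints true
  }
}
```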
If f(n) is in O(c*g(n)) for some c > 0, then f(n) is in O(g(n)).
Suppose f(n) is in O(3n + 5). Prove that f(n) is in O(n).
f(n) is in O(3n+5) implies f(n) is in O(3n).
See lemma entitled “You can ignore lower-order terms.”
f(n) is in O(3n) implies f(n) is in O(n)
See lemma entitled “You can ignore constant multipliers.”
Q.E.D.
There exist c, n0 s.t., f(n) <= c*(3n+5) for all n>n0.
By definition of big-O.
f(n) <= 3cn+5c for all n > n0
By algebra
f(n) <= 3cn + 5c <= 3cn + cn if n > 5 (and n > n0)
f(n) <= 4cn if n > 5 (and n > n0)
Let d = 4c. Let n1 = max(5,n0)
f(n) <= d*n for all n > n1
f(n) is in O(n)
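We can sanity-check the constructed witnesses numerically (a hypothetical harness; names mine). Taking f(n) = 3n + 5 itself, c = 1 and n0 = 1 work trivially, so the proof predicts f(n) <= d*n with d = 4c = 4 for all n > n1 = max(5, n0) = 5:

```java
// Hypothetical check of the witnesses from the proof:
// with f(n) = 3n + 5 and c = 1, we should have f(n) <= 4n for all n > 5.
public class LinearBound {
  static boolean boundHolds(long limit) {
    for (long n = 6; n <= limit; n++) {  // all n > n1 = 5 up to limit
      if (3 * n + 5 > 4 * n) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(boundHolds(100000)); // prints true
    // Note that the bound genuinely needs n > 5: at n = 4, 3n + 5 = 17 > 16 = 4n.
  }
}
```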
Why do we have Big-Oh notation?
For iterative algorithms, we often look at (analyze) loop behavior.
Conveniently, the value this algorithm returns counts how many times the inner body runs, so the bound we compute should match the result.
int loopZero(int n) {
  int result = 0;
  for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= n; j++) {
      result += 1;
    } // for j
  } // for i
  return result;
} // loopZero
loopZero is in O(n^2)
For nested loops, the easiest (although not necessarily most accurate) strategy is to count the worst case of the number of repetitions of the outer loop, the worst case of the inner loop, and multiply them together.
In this case, the outer loop should run n times, and the inner loop will run n times, so this should be n^2.
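Since loopZero counts its own inner-loop executions, we can confirm the n^2 analysis by running it. Here is a small harness around the notes' code (the wrapper class and check are mine):

```java
// loopZero from the notes, wrapped so we can check that the count
// of inner-loop executions is exactly n * n.
public class LoopZeroCheck {
  static int loopZero(int n) {
    int result = 0;
    for (int i = 1; i <= n; i++) {
      for (int j = 1; j <= n; j++) {
        result += 1;
      } // for j
    } // for i
    return result;
  } // loopZero

  public static void main(String[] args) {
    for (int n = 0; n <= 100; n++) {
      if (loopZero(n) != n * n) {
        throw new AssertionError("mismatch at n = " + n);
      }
    }
    System.out.println("loopZero(n) == n*n for 0 <= n <= 100");
  }
}
```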
Let’s change the outer loop to double i, rather than increment it.
int loopOne(int n) {
  int result = 0;
  for (int i = 1; i <= n; i *= 2) {
    for (int j = 1; j <= n; j++) {
      result += 1;
    } // for j
  } // for i
  return result;
} // loopOne
How does the result differ?
loopOne is in O(…)
The outer loop runs no more than n times, since doubling is no worse than adding 1.
The inner loop runs n times.
So, it’s O(n^2)
We can get a tighter bound.
The outer loop runs no more than func(n) times, where func(n) is “the number of times I have to double 1 before I exceed or equal n.” That function is (approximately) log2(n).
The inner loop runs n times.
This algorithm is O(n*log2(n))
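To see that the tighter bound is the right one, we can compare loopOne's count against n times the number of doublings. Here is a small harness around the notes' code (the wrapper class and the helper doublings, which computes floor(log2(n)) + 1 for n >= 1, are mine):

```java
// loopOne from the notes, plus a helper that counts the outer iterations.
public class LoopOneCheck {
  static int loopOne(int n) {
    int result = 0;
    for (int i = 1; i <= n; i *= 2) {
      for (int j = 1; j <= n; j++) {
        result += 1;
      } // for j
    } // for i
    return result;
  } // loopOne

  // How many values i = 1, 2, 4, ... stay <= n; i.e., floor(log2(n)) + 1.
  static int doublings(int n) {
    int count = 0;
    for (int i = 1; i <= n; i *= 2) {
      count++;
    }
    return count;
  }

  public static void main(String[] args) {
    for (int n = 1; n <= 1000; n++) {
      if (loopOne(n) != n * doublings(n)) {
        throw new AssertionError("mismatch at n = " + n);
      }
    }
    System.out.println("loopOne(n) == n * (floor(log2(n)) + 1) for 1 <= n <= 1000");
  }
}
```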
One of your homework assignments is to determine whether O(log(n)) == O(log2(n)).
Let’s change the inner loop to go up to i rather than n.
int loopTwo(int n) {
  int result = 0;
  for (int i = 1; i <= n; i *= 2) {
    for (int j = 1; j <= i; j++) {
      result += 1;
    } // for j
  } // for i
  return result;
} // loopTwo
Idea one: Outer loop runs log2(n) times. When i is n, inner loop runs n times. So, we still have O(n*log2(n)).
Idea two: Tease apart the loop.
1 + 2 + 4 + 8 + 16 + 32 + 64 + 128 + … + 2^k
Where 2^k = n.
That sum ends up being 2^(k+1) - 1 = 2n - 1, so loopTwo is in O(n).
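A quick harness around the notes' loopTwo (the wrapper class is mine) checks this: at powers of two the count is exactly 2n - 1, and for every n it stays below 2n, consistent with the 2^(k+1) - 1 sum:

```java
// loopTwo from the notes, wrapped so we can check its exact count.
public class LoopTwoCheck {
  static int loopTwo(int n) {
    int result = 0;
    for (int i = 1; i <= n; i *= 2) {
      for (int j = 1; j <= i; j++) {
        result += 1;
      } // for j
    } // for i
    return result;
  } // loopTwo

  public static void main(String[] args) {
    // At n = 2^k, the count is 1 + 2 + ... + 2^k = 2^(k+1) - 1 = 2n - 1.
    for (int n = 1; n <= 4096; n *= 2) {
      if (loopTwo(n) != 2 * n - 1) {
        throw new AssertionError("power-of-two mismatch at n = " + n);
      }
    }
    // For every n, the count is at most 2n - 1 < 2n, so loopTwo is in O(n).
    for (int n = 1; n <= 5000; n++) {
      if (loopTwo(n) >= 2 * n) {
        throw new AssertionError("bound fails at n = " + n);
      }
    }
    System.out.println("loopTwo(n) == 2n - 1 at powers of two; loopTwo(n) < 2n everywhere");
  }
}
```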
I probably stole this example from Skiena.
function loopThree(n)
  r := 0
  for i := 1 to n do
    for j := i + 1 to n do
      for k := j to i + j do
        r := r + 1
  return(r)
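Here is a runnable Java translation of loopThree (the translation is mine; I am reading the innermost bound as k running from j to i + j, as in Skiena's version of this example). Counting exactly, the innermost loop runs i + 1 times, so the total is the sum over i of (n - i)*(i + 1), which is in O(n^3):

```java
// Java translation of the loopThree pseudocode. Assumption: the innermost
// loop runs k = j .. i + j (inclusive), i.e., i + 1 iterations.
public class LoopThreeCheck {
  static int loopThree(int n) {
    int r = 0;
    for (int i = 1; i <= n; i++) {
      for (int j = i + 1; j <= n; j++) {
        for (int k = j; k <= i + j; k++) {
          r += 1;
        } // for k
      } // for j
    } // for i
    return r;
  } // loopThree

  public static void main(String[] args) {
    System.out.println(loopThree(2)); // prints 2
    System.out.println(loopThree(3)); // prints 7
    System.out.println(loopThree(4)); // prints 16
  }
}
```

The multiply-the-worst-cases strategy gives the same O(n^3): the outer two loops each run at most n times and the innermost at most n + 1 times.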