# Class 26: Algorithm Analysis (3)

Back to Algorithm Analysis (2). On to Project Discussion.

Held: Monday, 11 October 2004

Summary: Today we conclude our initial exploration of algorithm analysis by looking further at how to use the notation.

Related Pages:

Notes:

• Security Occifer Motta says "Oh shit, what did we do wrong now?"
• Security Occifer Motta also says "Good luck, I'm thinking about you."
• Math/CS table, Cowles, noon.

Overview:

• Review
• The role of experimentation
• Using the notation: Eliminating constants
• Using the notation: Simplifying running times
• Analyzing recursive procedures

## Review

• f(n) is in O(g(n)) iff
  • there exists a number n0,
  • there exists a number d > 0, and
  • for all n > n0, abs(f(n)) <= abs(d*g(n)).
• This notation provides a convenient mechanism for talking about running time without worrying about too many potentially useless details.
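The definition above can be spot-checked numerically. Here is a small sketch added for illustration (the helper name `holds_for` and the finite `limit` are my assumptions; a finite check can suggest membership but never prove it):

```python
# Spot-check a candidate witness pair (n0, d) for f(n) in O(g(n))
# on a finite range of n.  Suggestive only -- not a proof.
def holds_for(f, g, n0, d, limit=10000):
    """Check abs(f(n)) <= d * abs(g(n)) for all n0 < n <= limit."""
    return all(abs(f(n)) <= d * abs(g(n)) for n in range(n0 + 1, limit + 1))

# Example: f(n) = 100n is in O(n^2) with witnesses n0 = 100, d = 1.
print(holds_for(lambda n: 100 * n, lambda n: n * n, 100, 1))  # True
```

A failed check on every (n0, d) you try is a hint, though only a hint, that the membership does not hold.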

## Experimental Analysis

• Particularly as you start to analyze algorithms, it may be helpful to analyze them experimentally as well as abstractly.
• What do I mean when I say analyze them experimentally? I mean that we can time them on a variety of inputs and graph the results.
• If the experimental and the abstract results match, we can be fairly confident in the abstract results.
• If they don't, we may need to reflect and try again.
• Our analysis may be correct and our implementation incorrect.
• Our analysis may be correct and our data may all be outliers.
• Our analysis may be incorrect.
• ...
• Note that some analyses can be very difficult.
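As a concrete sketch of this kind of experiment (the `measure` helper and the choice of timing Python's built-in `sum` are my own illustrative assumptions, not part of the notes), we can time a procedure at several input sizes and look at how the measurements grow:

```python
import time

def measure(procedure, sizes):
    """Return (size, seconds) pairs timing procedure on inputs of each size."""
    results = []
    for n in sizes:
        data = list(range(n))             # build an input of size n
        start = time.perf_counter()
        procedure(data)
        results.append((n, time.perf_counter() - start))
    return results

# Time sum at a few sizes; tabulate or graph the pairs and compare the
# shape of the growth against the abstract analysis (sum should look
# roughly linear in n).
for n, secs in measure(sum, [1000, 10000, 100000]):
    print(n, secs)
```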

## Eliminating Constants

• One of the nice things about asymptotic analysis is that it makes constants unimportant because they can be hidden in the d.
• If f(n) is 100n seconds and g(n) is 0.5n seconds, then
  • f(n) is in O(g(n)) [let d be 200], and
  • g(n) is in O(f(n)) [let d be 1].
• If f(n) is 100n seconds and g(n) is n^2 seconds, then f(n) is in O(g(n)) [let n0 be 100 and d be 1].
• However, g(n) is not in O(f(n)). Why not?
  • Suppose there were witnesses n0 and d.
  • Consider any n greater than both n0 and 100d.
  • d*f(n) = d*100*n.
  • However, g(n) = n*n > 100d*n = d*f(n), contradicting the definition.
  • Since such an n exists for every choice of n0 and d, no witnesses can work.
• Since constants can be eliminated, we normally don't write them.
• That is, we say that the running time of an algorithm is O(n) or O(n^2) or ....
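The non-membership argument can also be illustrated numerically. This sketch (my addition; the helper name `first_failure` is an assumption) searches, for several choices of d, for the first n where the proposed bound n^2 <= d*100n breaks; such an n always exists just past 100*d, so no witness pair can work:

```python
def first_failure(d):
    """Smallest n with n*n > d * 100 * n, i.e. where the bound breaks."""
    n = 1
    while n * n <= d * 100 * n:
        n += 1
    return n

print(first_failure(1))  # 101
print(first_failure(5))  # 501
```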

## Relating Functions

• Note that some of the counting can be hard, so we may also need to estimate throughout the process.
• We can often express the running time of an algorithm as a composition of other functions.
• For example, we might have a primary control structure that repeats its contents O(h(n)) times and, within that control structure, a subalgorithm that takes O(g(n)) time on each repetition.
• Similarly, we might break our algorithm up into two independent algorithms, one of which takes O(h(n)) time and one of which takes O(g(n)) time.
• We might also look at other relationships.
• Here are some questions we might ask ourselves about composition of functions.
• If f(n) is in O(h(n)) and g(n) is in O(k(n)), then what can we say about f(n) + g(n)?
• If f(n) is in O(h(n)) and g(n) is in O(k(n)), then what can we say about f(n) * g(n)?
• If f(n) is in O(g(n)) and g(n) is in O(h(n)), then what can we say about f(n) and h(n)?
• If f(n) is in O(k*g(n)) and k is a constant, then what else can we say about f(n) and g(n)?
• If f(n) is in O(g(n) + h(n)) and g(n) is in O(h(n)), then what simpler thing can we say about f(n)?
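One way to build intuition for the first of these questions (this sketch and its particular bound are my illustration, not the notes' answer) is a finite spot check: with f(n) = n in O(n) and g(n) = n*n in O(n^2), the sum f(n) + g(n) stays within 2*n^2, suggesting membership in O(n^2):

```python
def within_bound(h, g, d, limit=10000):
    """Spot-check h(n) <= d * g(n) for 1 <= n <= limit (suggestive only)."""
    return all(h(n) <= d * g(n) for n in range(1, limit + 1))

f = lambda n: n          # in O(n)
g = lambda n: n * n      # in O(n^2)
print(within_bound(lambda n: f(n) + g(n), g, 2))  # True: d = 2 suffices here
```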

## Analyzing Recursive Procedures

• Unfortunately, the techniques for analyzing algorithms we discussed above focus primarily on iterative algorithms. What about recursive algorithms?
• The process seems inherently more difficult.
• However, we can use a similar strategy.
• Count the number of recursive calls.
• Count the time spent during each recursive call (except for the time spent in subsequent recursive calls).
• How do you count the number of recursive calls? Often, the shrink function gives you a sense of how many recursive calls you need to do.
• That is, count how many calls to shrink it takes to reduce the argument to the base case.
• It is also possible to use the concept of recurrence relations to determine the running time of an algorithm.
• For example, if a recursive algorithm takes c steps and reduces the parameter by one, we might express the running time as a function, time(n), which represents the running time on input of n elements:
• time(n) = c + time(n-1)
• time(1) = c
• This equation is relatively easy to solve "intuitively".
• time(n) = c + time(n-1)
• = c + c + time(n-2)
• = ...
• = c + c + ... + time(1)
• = c + c + ... + c [n times]
• = c*n
• Other interesting recurrence relations:
• time(n) = n + time(n-1)
• time(n) = c n + time(n-1)
• time(n) = c + time(n/2)
• time(n) = n + time(n/2)
• time(n) = c * time(n-1)
• time(n) = c * time(n/2)
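To get a feel for these (a sketch of my own, with assumed base cases that the list above leaves open), we can evaluate a couple of the recurrences directly and watch how they grow:

```python
def t_linear(n, c=1):
    """time(n) = c + time(n-1), time(1) = c  ->  solves to c*n."""
    return c if n == 1 else c + t_linear(n - 1, c)

def t_halving(n, c=1):
    """time(n) = c + time(n/2) with integer halving, time(1) = c."""
    return c if n <= 1 else c + t_halving(n // 2, c)

print(t_linear(10, 3))   # 30, matching the c*n solution above
print(t_halving(1024))   # 11: about log2(n) halvings, so logarithmic growth
```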


Disclaimer: I usually create these pages on the fly, which means that I rarely proofread them and they may contain bad grammar and incorrect details. It also means that I tend to update them regularly (see the history for more details). Feel free to contact me with any suggestions for changes.

This document was generated by Siteweaver on Wed Dec 8 10:37:18 2004.
The source to the document was last modified on Thu Aug 26 20:22:23 2004.
This document may be found at `http://www.cs.grinnell.edu/~rebelsky/Courses/CS152/2004F/Outlines/outline.26.html`.


Samuel A. Rebelsky, rebelsky@grinnell.edu