Asymptotic Analysis
Introduction
- Often a choice of algorithms, data structures
- Must choose most appropriate
- Different time requirements
- Different space requirements
- Often tradeoffs between the two!
- Must have objective basis for making choices
Asymptotic Analysis
- Rate at which usage of time or space grows with input size
- Dependent on many factors: particular machine, particular compiler, etc.
- Want to describe the algorithm apart from these concerns
- Analyze rate of growth, not absolute usage: asymptotic analysis
- Always assume basic operations (arithmetic, comparisons, assignments) take constant time (see the sketch below)
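A minimal C sketch of this step-count model (an assumed example, not from the slides): the constants depend on the machine and compiler, but the linear growth rate does not.

```c
#include <stddef.h>

/* Step-count model (assumed example): if each marked operation takes
   constant time, T(n) = c1*n + c2 for some constants c1, c2 -- the
   constants vary by machine/compiler, the linear growth does not. */
long sum(const long *a, size_t n) {
    long s = 0;                      /* 1 assignment            */
    for (size_t i = 0; i < n; i++)   /* n+1 tests, n increments */
        s += a[i];                   /* n additions, n loads    */
    return s;                        /* 1 return                */
}
```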
Big-Oh Notation
- Idea: resource requirements grow no faster than some known function, up to a constant factor
- T(n) = O(f(n)): there exist positive constants c and n0 such that T(n) ≤ c·f(n) for all n ≥ n0 (worked example below)
- Read: "T(n) is big-oh of f(n)"
- Gives an upper bound on T's complexity
- There are other, similar notations for lower bounds, etc.
- May come up with different bounds for best, average, worst cases
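A worked instance of the definition, using an assumed example function T(n) = 3n^2 + 5n:

```latex
\text{Claim: } 3n^2 + 5n = O(n^2).\\
\text{Choose } c = 4,\ n_0 = 5.\ \text{For all } n \ge 5 \text{ we have } 5n \le n^2, \text{ so}\\
T(n) = 3n^2 + 5n \;\le\; 3n^2 + n^2 \;=\; 4n^2 = c\,n^2 .
```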
Using Big-Oh Notation
- Gives an upper bound, but not unique!
- n^3 - 2n = O(n^3) and also O(n^15) (constants worked out below)
- Tighter bounds are more helpful -- they describe behavior more precisely
- Not symmetric: f(n) = O(g(n)) doesn't mean g(n) = O(f(n))
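Constants for the bounds above, worked out from the definition (the choices of c and n_0 are illustrative):

```latex
n^3 - 2n \;\le\; 1 \cdot n^3 \ \text{ for } n \ge 1 \;\Rightarrow\; n^3 - 2n = O(n^3)\\
n^3 - 2n \;\le\; 1 \cdot n^{15} \ \text{ for } n \ge 1 \;\Rightarrow\; n^3 - 2n = O(n^{15})\\
\text{Asymmetry: } n^{15} \ne O(n^3), \text{ since } n^{15}/n^3 = n^{12} \text{ is unbounded,}\\
\text{so no constant } c \text{ satisfies } n^{15} \le c\,n^3 \text{ for all large } n.
```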
Common Big-Oh Bounds
- O(1): constant time; example: pointer dereference
- O(lg n): logarithmic time; example: binary search
- O(n): linear time; example: linear search
- O(n lg n); example: optimal comparison-based sorts
- O(n^k): polynomial time; example: insertion sort is O(n^2)
- O(2^n): exponential time; example: badly-written LCS (code sketches below)
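Minimal C sketches of several of these bounds (all names and signatures are illustrative, not from the slides; the LCS version is deliberately the naive exponential recursion):

```c
#include <stddef.h>

/* O(n): may scan every element in the worst case. */
int linear_search(const int *a, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (a[i] == key) return (int)i;
    return -1;
}

/* O(lg n): halves the sorted range [lo, hi) each step. */
int binary_search(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return (int)mid;
        if (a[mid] < key) lo = mid + 1; else hi = mid;
    }
    return -1;
}

/* O(n^2): each element may shift past all earlier ones. */
void insertion_sort(int *a, size_t n) {
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) { a[j] = a[j - 1]; j--; }
        a[j] = key;
    }
}

/* O(2^n): naive recursive LCS length -- recomputes the same
   subproblems exponentially many times. */
size_t lcs_naive(const char *x, const char *y) {
    if (*x == '\0' || *y == '\0') return 0;
    if (*x == *y) return 1 + lcs_naive(x + 1, y + 1);
    size_t a = lcs_naive(x + 1, y);
    size_t b = lcs_naive(x, y + 1);
    return a > b ? a : b;
}
```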
Choosing Between Alternative Algorithms
- May have the same Big-Oh bound, but behave very differently!
- Asymptotic analysis ignores constant factors
- 500n^2 + 85n + 10 and 2n^2 + 3 are both O(n^2) (compared in the sketch after this list)
- Choose based on: size of input (one may perform better on smaller inputs, the other on larger), ease of coding, time-space tradeoffs, type of input
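A small sketch comparing the two step-count functions from this slide: both are O(n^2), yet the ratio of actual costs approaches 250, exactly the kind of constant factor asymptotic analysis ignores.

```c
#include <stdio.h>

/* The two step-count functions from the slide: both O(n^2),
   but with very different constant factors. */
static double t1(double n) { return 500*n*n + 85*n + 10; }
static double t2(double n) { return 2*n*n + 3; }

int main(void) {
    /* The ratio tends to 500/2 = 250 as n grows. */
    for (double n = 1; n <= 1e6; n *= 100)
        printf("n = %8.0f   t1/t2 = %6.1f\n", n, t1(n) / t2(n));
    return 0;
}
```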