**Tilde notation**.
We say that f(N) ~ g(N) if f(N)/g(N) converges to 1 as N gets large.
This is a general concept about mathematical functions
and is not restricted to running time, memory, or any other specific domain.
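For example (an illustration we add here, assuming f(N) = N^{2}/2 + 100N), the ratio f(N)/g(N) with g(N) = N^{2}/2 tends to 1 as N grows, so f(N) ~ N^{2}/2:

```java
// Illustration (assumed example): f(N) = N^2/2 + 100N is ~ N^2/2,
// because the ratio f(N)/g(N) approaches 1 as N gets large.
public class TildeDemo {
    public static void main(String[] args) {
        for (long n = 100; n <= 1_000_000; n *= 100) {
            double f = 0.5 * n * n + 100.0 * n;  // f(N) = N^2/2 + 100N
            double g = 0.5 * n * n;              // g(N) = N^2/2
            System.out.println("N = " + n + "  f/g = " + f / g);
        }
    }
}
```

The printed ratios shrink toward 1 as N grows, which is exactly what the tilde claim asserts.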

**Power-law assumption**.
For empirical analyses in COS 226, we typically assume that the running time
obeys a power law: T(N) ~ aN^{b}.
Empirical derivation of accurate hypotheses for non-power law running times is
beyond the scope of our course.
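Under the power-law assumption, the doubling method estimates the exponent b: if T(N) ~ aN^{b}, then T(2N)/T(N) approaches 2^{b}, so b ≈ lg(T(2N)/T(N)). A minimal sketch, where `timeTrial` is a hypothetical stand-in for timing your own program (here, a made-up exactly quadratic workload):

```java
// Doubling-method sketch: estimate b from the ratio of running times
// at N and 2N, since T(2N)/T(N) -> 2^b under the power-law assumption.
public class DoublingTest {
    // Hypothetical workload: pretend the running time is 2e-9 * N^2 seconds.
    // In practice you would time your actual program here.
    static double timeTrial(int n) {
        return 2e-9 * n * (double) n;
    }

    public static void main(String[] args) {
        double prev = timeTrial(250);
        for (int n = 500; n <= 64000; n *= 2) {
            double t = timeTrial(n);
            double b = Math.log(t / prev) / Math.log(2);  // estimate of b
            System.out.printf("N = %6d  estimated b = %.2f%n", n, b);
            prev = t;
        }
    }
}
```

For this made-up quadratic workload the estimate converges to b = 2; with real timing data the estimates fluctuate but should stabilize if the power law holds.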

**Cost model**.
For theoretical analyses of running time in COS 226, we will assume a *cost model*,
namely that some particular operation (or operations) dominates the running time of a program.
Then, we express the running time in terms of the total number of that operation
as a function of the input size.
To simplify things, we usually express this frequency count using tilde notation.
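As an illustration (a made-up example, not one from these notes), take the equality test as the cost model for a brute-force 2-sum: the double loop examines every pair, so the count is N(N-1)/2 ~ N^{2}/2 tests as a function of the input size N:

```java
// Cost-model sketch (assumed example): count equality tests in a
// brute-force 2-sum. The nested loops examine N(N-1)/2 pairs,
// so the frequency count is ~ N^2/2.
public class TwoSumCount {
    public static long countPairs(int[] a) {
        long tests = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++) {
                tests++;                          // one equality test per pair
                if (a[i] + a[j] == 0) { /* pair found */ }
            }
        return tests;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] a = new int[n];
        System.out.println(countPairs(a));  // prints n*(n-1)/2 = 499500
    }
}
```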

**Order of growth**.
If we have two functions f(N) and g(N), and f(N) ~ c g(N) for some constant c > 0,
we say the order of growth of f(N) is g(N).
Typically g(N) is one of the following functions:
1, log N, N, N log N, N^{2}, N^{3}, or 2^{N}.

**Performance depends on inputs**.
We can characterize an algorithm's performance by its best case, worst case, and average
case.
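A concrete illustration (an assumed example, not one from these notes): sequential search in an unordered array, counting compares. The best case finds the key immediately, the worst case scans the whole array, and the average case for a random successful search is ~ N/2 compares:

```java
// Illustration (assumed example): sequential search.
// Best case: key at index 0 (1 compare). Worst case: key absent
// (N compares). Average successful search: ~ N/2 compares.
public class SequentialSearch {
    public static int indexOf(int[] a, int key) {
        for (int i = 0; i < a.length; i++)
            if (a[i] == key) return i;   // compares so far: i + 1
        return -1;                       // compares: a.length
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 9, 7};
        System.out.println(indexOf(a, 5));  // best case: prints 0
        System.out.println(indexOf(a, 2));  // worst case: prints -1 after 4 compares
    }
}
```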

**Difficulty of a problem**.
To understand the difficulty of a particular problem in COS 226, we often consider the
worst-case order-of-growth of the best possible algorithm for the problem.
We can upper bound the difficulty of a problem by the performance
of the best-known algorithm. Finding a good lower bound for a problem is
usually a difficult challenge.

**Big Oh, Big Omega, Big Theta**.
These notations are commonly used in the theory of algorithms. They are similar
in spirit to tilde notation, but discard the leading constant as well as
lower-order terms.
Many programmers use Big-Oh notation incorrectly when they really mean
order of growth (Big Theta): Big Oh is only an upper bound, so, for example, a
linear-time algorithm is also O(N^{2}).
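For reference, the standard definitions behind these notations (stated informally, matching the inequality form used in most textbooks):

```
f(N) = O(g(N))  iff  f(N) <= c g(N)  for some constant c > 0 and all sufficiently large N   (upper bound)
f(N) = Ω(g(N))  iff  f(N) >= c g(N)  for some constant c > 0 and all sufficiently large N   (lower bound)
f(N) = Θ(g(N))  iff  f(N) is both O(g(N)) and Ω(g(N))                                       (tight bound)
```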

**Worst-case order of growth isn't everything**.
Just because one algorithm has a better order of growth than another does not
mean that it is faster in practice. We will encounter some notable counterexamples,
including quicksort vs. mergesort.

**Memory analysis**.
Know how to calculate the memory usage of a class using the 64-bit memory model
from the textbook.
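A sketch of that calculation (the class itself is made up; the counts follow the textbook's 64-bit model: 16 bytes of object overhead, 8-byte references, and padding to a multiple of 8 bytes):

```java
// Worked memory example under the 64-bit model (hypothetical class).
public class Node {          // memory per Node object:
    int value;               //   4 bytes (int)
    boolean marked;          //   1 byte  (boolean)
    Node next;               //   8 bytes (reference)
    // object overhead:          16 bytes
    // subtotal:                 29 bytes
    // padding (multiple of 8):   3 bytes
    // total:                    32 bytes
}
```

So each Node instance uses 32 bytes; a linked list of N nodes would use ~32N bytes (plus the memory for whatever the list references).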

**Theoretical and empirical analysis**.
Hypotheses generated through theoretical analysis (or guesswork like our power law
assumption) should be validated with data before being fully trusted.

- Textbook 1.4.4
- Fall 2010 Midterm, #1d
- Fall 2011 Midterm, #2

- Textbook 1.4.5
- Fall 2009 Midterm, #1
- Fall 2010 Midterm, #1a, #1c
- Spring 2012 Midterm, #1
- Spring 2013 Final, #4

- Spring 2013 Midterm, #1
- Suppose the optimal (and possibly unknown) solution to problem P has order of growth F(N). Suppose that the best known solution has runtime that is Θ(B(N)). Finally, suppose that there is a clever proof that no solution can possibly have order of growth that is less than L(N). Which of the following can you surmise?
  - F(N) = O(B(N))
  - B(N) = O(F(N))
  - The limit of F(N)/B(N) as N goes to infinity cannot be infinity.
  - F(N) = Ω(L(N))
  - L(N) = Ω(F(N))
  - B(N) > F(N) for sufficiently large N.