In order to choose the right algorithm for a problem, computer scientists must be able to estimate how long an algorithm will take to run. Here we discuss three important concepts in algorithmic analysis: big Oh notation, recursive runtimes, and expected runtimes.
To gauge algorithmic runtimes, computer scientists use the big Oh notation system. Loosely speaking, if we say that an algorithm runs in O(f(n)) time, we mean that in the worst case scenario our algorithm's runtime will be at most some constant multiple of f(n), where n is our input size and f is some mathematical function, and that this bound holds for all inputs beyond a certain size. In the plot to the left, notice that the green plots have a similar shape to one another, and the blue plots likewise, but the shapes differ between the two colors. That's because the blue algorithms, merge sort and quicksort, would be classified in the big Oh notation system as O(n log n) algorithms, while the green algorithms, bubble sort and insertion sort, would be classified as O(n^2) algorithms but not O(n log n) algorithms. Try hovering over the graph to see what this looks like numerically.
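To see the same growth-rate difference outside the interactive plot, here is a minimal Python sketch (the function names are purely illustrative) that times bubble sort against merge sort on doubling input sizes; doubling n should roughly quadruple bubble sort's time, while merge sort's time grows only slightly faster than linearly.

```python
import random
import time

def bubble_sort(a):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge_sort(a):
    """O(n log n): split in half, sort each half recursively, then merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

for n in (1000, 2000, 4000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter(); bubble_sort(data)
    t1 = time.perf_counter(); merge_sort(data)
    t2 = time.perf_counter()
    print(f"n={n}: bubble sort {t1 - t0:.3f}s, merge sort {t2 - t1:.3f}s")
```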
Finding the runtime of an algorithm such as "find the minimum of an unsorted array" is straightforward: we check every element from front to back while keeping track of the smallest value seen so far, doing a constant amount of work per element, so the runtime is at most a constant multiple of the input size, i.e. O(n). How do we measure the runtimes of more complicated algorithms, in particular recursive algorithms? For that purpose we have the Master Theorem, which states that if a recursion can be written as $$T(n) = \color{green}{a}T(n/\color{blue}{b}) + \color{orange}{f(n)}$$ its asymptotic runtime can be determined in most common cases by comparing the non-recursive work f(n) against the work generated by the recursive calls. In this diagram, the green slider represents a, the number of recursive branches, and the blue slider represents b, the factor by which each subproblem shrinks (each recursive call receives an input of size n/b). f(n) represents the non-recursive work done at each call, such as splitting the input and combining results. Try moving the sliders around to see how the total amount of work, represented by the total area, changes for different values in the formula.
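As a concrete instance (a minimal Python sketch, separate from the interactive diagram), recursive binary search makes a = 1 recursive call on half the input (b = 2) and does only constant extra work, so the Master Theorem resolves T(n) = T(n/2) + O(1) to O(log n).

```python
def binary_search(a, target, lo=0, hi=None):
    """Recursive binary search on a sorted list.

    Recurrence: T(n) = 1 * T(n/2) + O(1)   (a = 1, b = 2, f(n) = O(1))
    The Master Theorem gives T(n) = O(log n).
    """
    if hi is None:
        hi = len(a)
    if lo >= hi:                      # empty range: not found
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    elif a[mid] < target:             # recurse on the right half
        return binary_search(a, target, mid + 1, hi)
    else:                             # recurse on the left half
        return binary_search(a, target, lo, mid)

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1
```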
This category occurs when the non-recursive work f(n) at each call is quadratic in the size of that call's input. Examples of such algorithms include divide-and-conquer matrix multiplication and Strassen's Algorithm.
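As a rough illustration, the hypothetical helper below applies a simplified Master Theorem (assuming f(n) is a plain polynomial n^k) by comparing k to the critical exponent log_b(a). Plugging in the recurrences for divide-and-conquer matrix multiplication (a = 8, b = 2, k = 2) and Strassen's Algorithm (a = 7, b = 2, k = 2) gives Theta(n^3) and roughly Theta(n^2.81) respectively.

```python
import math

def master_theorem(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) with the Master Theorem
    (simplified form: f(n) must be a plain polynomial n^k)."""
    crit = math.log(a, b)                    # critical exponent log_b(a)
    if math.isclose(k, crit):
        return f"Theta(n^{k} log n)"         # work balanced across levels
    elif k < crit:
        return f"Theta(n^{crit:.2f})"        # leaves of the recursion dominate
    else:
        return f"Theta(n^{k})"               # top-level work dominates

# Divide-and-conquer matrix multiplication: 8 half-size subproblems,
# quadratic combining work per call.
print(master_theorem(8, 2, 2))               # Theta(n^3.00)
# Strassen's Algorithm: only 7 subproblems, same quadratic combine step.
print(master_theorem(7, 2, 2))               # Theta(n^2.81)
```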
Worst case analysis is fantastic at guaranteeing that our algorithm will never run slower than a certain bound. However, sometimes considering average runtime yields much more insight into how long an algorithm will actually take to run. In the accompanying illustration, we have a classic use case for expected runtimes, namely hash tables. A hash table is a data structure which maps keys to values, allowing more general access patterns than an array's integer indexing. The illustration uses chained hashing, i.e. every key hashes to a bucket, and each bucket stores its entries in a linked list. By having many possible buckets and a good hash function to assign keys to them, we keep the average amount of time per operation low, assuming keys are spread roughly uniformly. Try pressing the "Insert random key" and "Delete random key" buttons to get a better feel for this data structure. Even though in the worst case we may end up with very time-consuming arrangements, e.g. all keys hashed to the same bucket, the average number of keys per bucket (the expected value) is low, hence the usefulness of the hash table.
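To make the chained-hashing idea concrete, here is a minimal Python sketch (the class and method names are illustrative, not the implementation behind the widget) where each bucket is a plain list serving as the chain. With a good hash function and enough buckets, the expected chain length stays small, so insert, get, and delete take expected O(1) time even though the worst case, with every key in one bucket, degrades to O(n).

```python
class ChainedHashTable:
    """Minimal chained hash table: each bucket is a plain Python list
    acting as the chain of (key, value) pairs."""

    def __init__(self, num_buckets=8):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # A good hash function spreads keys uniformly over the buckets,
        # keeping the expected chain length small.
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))       # expected O(1), worst case O(n)

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

    def delete(self, key):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                del bucket[i]
                return
        raise KeyError(key)

table = ChainedHashTable()
table.insert("apple", 3)
table.insert("pear", 5)
print(table.get("apple"))   # 3
table.delete("pear")
```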