Learning data structures and algorithms can be tricky; memorizing what O(n) means is one thing, but internalizing it so that you can *see* what it means is another.

In that spirit, here is how I see O(n!).

O(n!) says that an algorithm's work grows with input size n such that each step multiplies the prior step's total: the work for an input of size n is n times the work for an input of size n - 1.
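That recurrence can be sketched directly. Here `work` is a hypothetical operation counter I'm introducing for illustration, not any particular algorithm:

```python
def work(n):
    """Operation count for a hypothetical O(n!) algorithm:
    an input of size n costs n times the work of size n - 1."""
    if n <= 1:
        return 1
    return n * work(n - 1)

# Each step multiplies the prior total by the current input size,
# so work(n) equals n!.
```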

Think of it like a loop whose iteration count equals n!: an input of size n means n! iterations.
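The classic concrete case of such a loop is visiting every permutation of the input, since n items have n! orderings. A minimal sketch, using a counter to make the iteration volume visible:

```python
from itertools import permutations

def count_permutation_visits(items):
    """Run a loop body once per ordering of `items`.
    The body executes len(items)! times."""
    visits = 0
    for _ in permutations(items):
        visits += 1  # stand-in for the real work done per ordering
    return visits
```

Adding one more item to the input multiplies the iteration count by the new size, which is exactly the factorial growth pattern described above.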

As we know from high school math, n! means 1 * 2 * 3 * … * n; this tells us something about the neighboring iteration volumes of any given input size: any sample input n1 has two predictable neighbor iteration volumes, (n1 - 1)! = n1! / n1 below it and (n1 + 1)! = n1! * (n1 + 1) above it.
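Those two neighbor relations are easy to check numerically; here `n1 = 5` is just an arbitrary sample input:

```python
import math

n1 = 5
current = math.factorial(n1)  # iteration volume at the sample input

# Left neighbor: divide out the current input size.
assert math.factorial(n1 - 1) == current // n1
# Right neighbor: multiply by the next input size.
assert math.factorial(n1 + 1) == current * (n1 + 1)
```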

This is how I see rate of change as being central to the concept of algorithmic complexity that big O represents: because we can reliably predict the rate at which an algorithm's work increases or decreases, knowing the operation volume for an input of size n also tells us the volumes for n's neighbors. Specific increases or decreases to n don't change that rate, which is why we drop constants.
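One way to see why constants drop out: scaling the operation count by a constant c leaves the step-to-step growth ratio untouched. The function `ops` below is a hypothetical counter I'm introducing to illustrate this:

```python
import math

def ops(n, c=1):
    """Hypothetical operation count c * n!, where c models a constant factor."""
    return c * math.factorial(n)

# The ratio between consecutive input sizes is always n + 1,
# no matter what constant multiplies the count, so c * n! and n!
# describe the same growth and the same complexity class, O(n!).
for n in range(1, 8):
    assert ops(n + 1) // ops(n) == n + 1
    assert ops(n + 1, c=3) // ops(n, c=3) == n + 1
```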

Next installment on O(log n) coming soon!