D6 - Useful Complexity Case, log(n!) = Θ(nlogn)
Data Structures, Complexity
log(n!) = Θ(nlogn)
This is a useful case of complexity analysis to remember: log(n!) and nlogn have about the same growth rate. To prove it, we will show that both log(n!) = O(nlogn) and log(n!) = Ω(nlogn) hold.
log(n!) = O(nlogn):
log(n!) = log(n(n-1)(n-2)(n-3)*...*2*1)
This is by definition of factorial.
log(n(n-1)(n-2)(n-3)*...*2*1) ≤ log(n*n*n*...*n) // *n factors, so the right side is nlog(n)*
Each of the n factors is at most n, so the whole product is at most n^n. Therefore, log(n!) = O(nlogn).
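A quick numeric sanity check of the upper bound (a sketch in Python; logs here are base 2, matching the rest of the proof):

```python
import math

def log2_factorial(n):
    """Compute log2(n!) as a sum of logs: log2(n) + log2(n-1) + ... + log2(1)."""
    return sum(math.log2(k) for k in range(1, n + 1))

for n in (4, 16, 256, 4096):
    lhs = log2_factorial(n)          # log(n!)
    rhs = n * math.log2(n)           # nlogn
    assert lhs <= rhs                # log(n!) <= nlogn
    print(f"n={n}: log2(n!) = {lhs:.1f} <= n*log2(n) = {rhs:.1f}")
```

Summing the logs of the factors (instead of computing n! and then taking a log) avoids overflow for large n.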
log(n!) = Ω(nlogn):
log(n!) = log(n(n-1)(n-2)(n-3)*...*2*1)
log(n(n-1)(n-2)(n-3)*...*2*1) ≥ log(n(n-1)(n-2)*...*(n/2 + 1))
Keeping only the upper half of the factors (the n/2 factors from n down to n/2 + 1) gives a smaller product than the whole n! (e.g. 6*5*4*3*2*1 is bigger than 6*5*4).
log(n(n-1)(n-2)*...*(n/2 + 1)) ≥ log((n/2 + 1)(n/2 + 1)*...*(n/2 + 1))
// *n/2 factors*
Each of the remaining factors is at least the smallest one, n/2 + 1, so the product is at least n/2 + 1 multiplied by itself n/2 times (continuing the previous example, 6*5*4 is bigger than 4*4*4).
Now:
log((n/2 + 1)(n/2 + 1)*...*(n/2 + 1)) = log((n/2 + 1)^(n/2)) = (n/2)log(n/2 + 1)
(n/2)log(n/2 + 1) ≥ (n/2)log(n/2)
Using the logarithm power rule, we pull the exponent n/2 out in front of the log. Then dropping the +1 inside the log gives an even smaller expression (taking our previous example to logs: log(4*4*4) = 3log(4) ≥ 3log(3)).
Next, by the definition of Ω(), we want to find a constant C and an n0 such that for every n ≥ n0: (n/2)log(n/2) ≥ C*nlogn.
(n/2)log(n/2) = (n/2)(log(n) - log(2)) = (n/2)(log(n) - 1) // *logs here are base 2, so log(2) = 1*
Now, picking C = 1/4 and n0 = 4, we can keep going:
(n/2)(log(n) - 1) ≥ (n/2)(log(n) - log(n)/2)
Note that log(n)/2 is at least 1, since we picked n0 = 4 (and log(4)/2 = 1), so subtracting log(n)/2 subtracts at least as much as subtracting 1.
(n/2)(log(n) - log(n)/2) = (n/2)(log(n)/2) = (1/4)(nlog(n)) = C * nlogn
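The whole lower-bound chain can be sanity-checked numerically as well (again a sketch; base-2 logs, with C = 1/4 and n0 = 4 as above):

```python
import math

def log2_factorial(n):
    """Compute log2(n!) by summing the log of each factor."""
    return sum(math.log2(k) for k in range(1, n + 1))

for n in (4, 16, 256, 4096):
    lf = log2_factorial(n)                  # log(n!)
    half = (n / 2) * math.log2(n / 2)       # (n/2)log(n/2)
    quarter = 0.25 * n * math.log2(n)       # C*nlogn with C = 1/4
    assert lf >= half >= quarter            # the full chain, for n >= n0 = 4
    print(f"n={n}: log2(n!)={lf:.1f} >= (n/2)log2(n/2)={half:.1f} >= (1/4)nlog2(n)={quarter:.1f}")
```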
And we're done! We've shown that even a reduced form of log(n!) is still in the group Ω(nlogn). Combined with the O(nlogn) bound from the first part, we get our final result: log(n!) = Θ(nlogn).
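As a side note, the constants in this proof are far from tight: by Stirling's approximation, the ratio log(n!)/(nlogn) actually tends to 1 as n grows. A small sketch of this, using math.lgamma (which computes the natural log of the gamma function, so lgamma(n+1) = ln(n!)):

```python
import math

# Ratio log2(n!) / (n * log2(n)); it stays between our C = 1/4 and 1,
# and creeps toward 1 as n grows (Stirling's approximation).
for n in (10, 100, 10_000, 1_000_000):
    log2_fact = math.lgamma(n + 1) / math.log(2)   # convert ln(n!) to log2(n!)
    ratio = log2_fact / (n * math.log2(n))
    print(f"n={n}: ratio = {ratio:.4f}")
```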
RwK