Question 3b has an "obligatory comment" saying the complexity of our code must be less than O(m^2*n), assuming n (the number of elements in the longest list) is larger than m by orders of magnitude. Wouldn't the worst-case complexity, where all lists are n elements long, be O(m*n^2)?
Date: 01 Dec 2014 08:55
Number of posts: 2
1) Why would n be "larger by magnitudes than m"?
The number of lists, m, could be much larger than n. These two parameters should be treated as independent in the complexity analysis.
2) The worst case is not about how large n or m are. It is about the worst possible scenario, given some fixed n and m.
More generally, when we talk about worst (best) case, we ask: among all possible inputs of some fixed size, what inputs cause maximal (minimal) running time?
3) In this case there should not really be a difference between worst and best case, much like in selection sort, where the algorithm performs roughly the same number of operations for all inputs of the same size.
This is in contrast, for example, to binary search, in which inputs (list and key) of the same size may yield different running times.
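The contrast in point 3 can be made concrete by counting comparisons. The sketch below is illustrative only (the function names and instrumented implementations are my own, not the course code): selection sort performs the same number of comparisons for every input of a given size, while binary search's comparison count depends on where the key sits.

```python
def selection_sort_comparisons(lst):
    """Selection sort on a copy of lst; returns the number of
    element comparisons performed (always n*(n-1)/2)."""
    lst = list(lst)
    count = 0
    n = len(lst)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            count += 1                     # one comparison per inner step
            if lst[j] < lst[min_idx]:
                min_idx = j
        lst[i], lst[min_idx] = lst[min_idx], lst[i]
    return count

def binary_search_comparisons(lst, key):
    """Binary search in a sorted list; returns the number of
    iterations (probes) until the key is found or the search ends."""
    lo, hi = 0, len(lst) - 1
    count = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1                         # one probe per iteration
        if lst[mid] == key:
            return count
        elif lst[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

# Selection sort: identical work for every input of size 8.
print(selection_sort_comparisons([8, 7, 6, 5, 4, 3, 2, 1]))  # 28
print(selection_sort_comparisons([1, 2, 3, 4, 5, 6, 7, 8]))  # 28

# Binary search: same-size input, different keys, different cost.
sorted_lst = list(range(16))
print(binary_search_comparisons(sorted_lst, 7))  # key at the first midpoint: 1
print(binary_search_comparisons(sorted_lst, 0))  # key far from the midpoint: 4
```

For selection sort the best and worst cases coincide, so a single bound describes all inputs of size n; for binary search the best case (key at the midpoint) and worst case (key absent or at the edges) genuinely differ.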