Difficulty in comprehending asymptotic notation



As far as I know from my reading and research:



Big-O notation describes the worst case of an algorithm's time complexity.



Big-Omega notation describes the best case of an algorithm's time complexity.



Big-Theta notation describes the average case of an algorithm's time complexity.



Source



However, I recently saw a discussion in which some guys said that:



O for worst case is a "forgivable" misconception. Θ for best is plain
wrong. Ω for best is also forgivable. But they remain misconceptions.



The big-O notation can represent any complexity. In fact it can
describe the asymptotic behavior of any function.



I remember learning the former definitions in my university course as well. I'm still a student, and if I have them wrong, could you please explain?





"some guys" are correct. See: How do O and Ω relate to worst and best case?
– Dukeling
Aug 25 at 20:49






@Dukeling So "Big-O notation describes the worst case of an algorithm's time complexity" is wrong????
– concurrencyboy
Aug 25 at 21:01





Yes, pretty much all of this is wrong: "Big-O notation is used to define the upper bound of an algorithm in terms of Time Complexity. That means Big-O notation always indicates the maximum time required by an algorithm for all input values. That means Big-O notation describes the worst case of an algorithm time complexity.". [It is an upper bound, but of a function, not an algorithm.]
– Dukeling
Aug 25 at 21:04





2 Answers



Bachmann-Landau notation has absolutely nothing whatsoever to do with the computational complexity of algorithms. This should already be obvious from the fact that the idea of a computing machine that executes an algorithm didn't really exist in 1894, when Bachmann first introduced this notation (Landau adopted it later).



Bachmann-Landau notation describes the growth rates of functions by grouping them together in sets of functions that grow at roughly the same rate. Bachmann-Landau notation does not say anything about what those functions mean. They are just functions. In fact, they don't have to mean anything at all.



All that they mean is this:

f ∈ o(g): f grows strictly slower than g
f ∈ O(g): f grows at most as fast as g
f ∈ Θ(g): f grows as fast as g
f ∈ Ω(g): f grows at least as fast as g
f ∈ ω(g): f grows strictly faster than g



It does not say anything about what f or g are, or what they mean.



Note that there are actually two conflicting, incompatible definitions of Ω; the one given here is the one that is more useful for computational complexity theory. Also note that these are only very broad intuitions; when in doubt, you should look at the definitions.
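For reference, here are the definitions those intuitions summarize (spelled out by me, with Knuth's convention for Ω):

    f ∈ O(g)  ⟺  there exist c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀
    f ∈ Ω(g)  ⟺  there exist c > 0 and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀
    f ∈ Θ(g)  ⟺  f ∈ O(g) and f ∈ Ω(g)

(The older Hardy-Littlewood convention instead defines Ω(g) as "not o(g)", i.e. f(n) ≥ c·g(n) for infinitely many n; the two conventions disagree on oscillating functions.)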



If you want, you can use Bachmann-Landau notation to describe the growth rate of a population of rabbits as a function of time, or the growth rate of a human's belly as a function of beers.



Or, you can use it to describe the best-case step complexity, worst-case step complexity, average-case step complexity, expected-case step complexity, amortized step complexity, best-case time complexity, worst-case time complexity, average-case time complexity, expected-case time complexity, amortized time complexity, best-case space complexity, worst-case space complexity, average-case space complexity, expected-case space complexity, or amortized space complexity of an algorithm.
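To make this concrete, here is a small illustration (my own example, not from the answer): a single algorithm, linear search, has best-case complexity Θ(1) and worst-case complexity Θ(N), and Bachmann-Landau notation is equally happy to describe either case function.

    # Linear search: one algorithm, several case functions.
    # Best case (target at index 0): 1 comparison  -> Θ(1).
    # Worst case (target absent):    N comparisons -> Θ(N).
    def linear_search(items, target):
        comparisons = 0
        for i, x in enumerate(items):
            comparisons += 1
            if x == target:
                return i, comparisons
        return -1, comparisons

    data = list(range(1000))
    print(linear_search(data, 0))    # (0, 1)       best case
    print(linear_search(data, -1))   # (-1, 1000)   worst case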



These assertions are at best inaccurate and at worst wrong. Here is the truth.



The running time of an algorithm is not a function of N (as it also depends on the particular data set), so you cannot discuss its asymptotic complexity directly. All you can say is that it lies between the best and worst cases.



The worst case, best case and average case running times are functions of N, though the average case depends on the probabilistic distribution of the data, so is not uniquely defined.



The asymptotic notations are then as follows:



O(f(N)) denotes an upper bound, which can be tight or not;



Ω(f(N)) denotes a lower bound, which can be tight or not;



Θ(f(N)) denotes a bilateral bound, i.e. the conjunction of O(f(N)) and Ω(f(N)); it is perforce tight.
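A worked illustration of tight versus loose bounds (my own example, not from the answer): take f(N) = 3N² + N. Then

    f ∈ O(N³)   upper bound, not tight
    f ∈ O(N²)   upper bound, tight
    f ∈ Ω(N)    lower bound, not tight
    f ∈ Ω(N²)   lower bound, tight
    f ∈ Θ(N²)   bilateral bound, necessarily tight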



So,



All of the worst-case, best-case and average-case complexities have a Θ bound, as they are functions; in practice this bound can be too difficult to establish and we content ourselves with looser O or Ω bounds instead.



It is absolutely not true that the Θ bound is reserved for the average case.



Examples:



The worst case of quicksort is Θ(N²) and its best and average cases are Θ(N Log N). So, by abuse of language, the running time of quicksort is O(N²) and Ω(N Log N).
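You can observe both bounds empirically (a minimal sketch of mine, assuming a naive first-element pivot; real implementations pick pivots more carefully): an already-sorted input triggers the Θ(N²) worst case, while a shuffled input stays near N Log N comparisons.

    import random

    # Quicksort with a first-element pivot; returns the comparison count.
    def quicksort_comparisons(a):
        count = 0
        def qs(lo, hi):            # sorts a[lo:hi] in place
            nonlocal count
            if hi - lo <= 1:
                return
            pivot, i = a[lo], lo + 1
            for j in range(lo + 1, hi):
                count += 1
                if a[j] < pivot:
                    a[i], a[j] = a[j], a[i]
                    i += 1
            a[lo], a[i - 1] = a[i - 1], a[lo]
            qs(lo, i - 1)
            qs(i, hi)
        qs(0, len(a))
        return count

    n = 500   # kept small so the worst case stays within Python's recursion limit
    print(quicksort_comparisons(list(range(n))))              # ~ n²/2    (worst case)
    print(quicksort_comparisons(random.sample(range(n), n)))  # ~ n log n (typical case)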



The worst-case, best-case and average cases of selection sort are all three Θ(N²). (Insertion sort would not do here: its best case, on already-sorted input, is Θ(N).)
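A quick way to see this (again a sketch of mine, not part of the answer): selection sort performs exactly N(N−1)/2 comparisons regardless of how the input is arranged, so its best, worst and average case functions coincide.

    # Selection sort does N(N-1)/2 comparisons whatever the input order.
    def selection_sort_comparisons(a):
        a = list(a)
        comparisons = 0
        for i in range(len(a)):
            m = i
            for j in range(i + 1, len(a)):
                comparisons += 1         # runs once per pair (i, j)
                if a[j] < a[m]:
                    m = j
            a[i], a[m] = a[m], a[i]
        return comparisons

    n = 100
    print(selection_sort_comparisons(list(range(n))))         # 4950 (sorted input)
    print(selection_sort_comparisons(list(range(n, 0, -1))))  # 4950 (reversed input)
    print(n * (n - 1) // 2)                                   # 4950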



Any algorithm that needs to examine all of its input is best-case, average-case and worst-case Ω(N) (and without more information we cannot give any upper bound).



Matrix multiplication is known to be doable in time O(N³). But we still don't know the tight Θ bound of this operation; it is conjectured to be N² times a slowly growing function. Thus Ω(N²) is an obvious lower bound, but it is not known to be tight. (The worst, best and average cases all have the same complexity.)
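The O(N³) upper bound comes from the schoolbook algorithm (sketched below, my own illustration; asymptotically faster algorithms exist, starting with Strassen's O(N^2.81)):

    # Schoolbook N x N matrix multiplication: three nested loops,
    # hence Θ(N³) scalar multiplications.
    def matmul(A, B):
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    C[i][j] += A[i][k] * B[k][j]
        return C

    I = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
    M = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    print(matmul(I, M) == M)   # True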



Some confusion stems from the fact that the best/average/worst cases have well-defined running times, while the running time of the general case lies in a range (it is in fact a random variable). In addition, algorithm analysis is often tedious, if not intractable, and we use simplifications that can lead to loose bounds.





I don't think this is true, or even meaningful: "in general the running time of quicksort is O(N²) and Ω(N Log N)". That's probably the kind of thing that leads to the misconceptions above.
– Matt Timmermans
Aug 26 at 1:20





@MattTimmermans if some concepts are explained wrongly here, why not write a new answer to fix them and help?
– concurrencyboy
Aug 26 at 5:38





@MattTimmermans: can you elaborate on what is wrong?
– Yves Daoust
Aug 26 at 9:10





@YvesDaoust "in general the running time of quicksort is O(N²) and Ω(N Log N)" is repeating the problem in the original quote. You appear to be using O and Ω to indicate the worst- and best-case running time of the algorithm.
– Dukeling
Aug 26 at 9:56






@Dukeling: I explained that the general running time is not a function and discussions about it lack rigor.
– Yves Daoust
Aug 26 at 9:59






