The complexity of an algorithm is usually described as a function of the size of its input.
For example, the cost of searching for a particular element in an array of n integers with linear search can be described as
f(n) = 4 + 2n
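As a rough sketch, here is what such a linear search might look like; the constants 4 and 2 are illustrative instruction counts, not exact measurements, and the function name is chosen for this example.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, value in enumerate(arr):  # roughly 2 instructions per element:
        if value == target:          # advance the index, then compare
            return i
    return -1                        # plus a few fixed setup/return instructions

print(linear_search([3, 1, 4, 1, 5], 4))  # 2
```

The fixed setup and return work contribute the constant term (the "4"), and the per-element work contributes the linear term (the "2n").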
However, in terms of performance, we are generally interested in the terms that grow fastest as n increases, not in those that grow slowly or not at all. So we can drop the constant 4 from the function above, and the constant factor 2 as well. The function then becomes
f(n) = n
This method of dropping constant factors and "keeping the fastest-growing term" is what we call asymptotic behavior. So the asymptotic behavior of f(n) = 4 + 2n is described by the function f(n) = n. In mathematical terms, it means we are interested in the limit of the function f as n tends to infinity. (Mathematically, an asymptote of a curve is a line such that the distance between the curve and the line approaches zero as they tend to infinity. Asymptotic analysis is a method of describing this limiting behavior.)
So a program that does not contain any loop can be described as f(n) = 1, because its number of instructions is fixed regardless of the input. A program containing one loop (from 1 to n) is described as f(n) = n.
Similarly, a program with 2 nested loops (a loop within a loop) will be f(n) = n*n.
A program with 3 nested loops (a loop within a loop within a loop) will be f(n) = n*n*n.
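The loop counts above can be checked directly by counting how many times the innermost body runs; this is a small illustrative sketch, with function names chosen for the example.

```python
def one_loop(n):
    ops = 0
    for i in range(n):
        ops += 1              # body runs n times -> f(n) = n
    return ops

def two_loops(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1          # body runs n*n times -> f(n) = n*n
    return ops

def three_loops(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                ops += 1      # body runs n*n*n times -> f(n) = n*n*n
    return ops

print(one_loop(10), two_loops(10), three_loops(10))  # 10 100 1000
```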
Big-Oh Notation: This is the notation that expresses the worst-case complexity of an algorithm, i.e. the running time of the algorithm will never exceed a certain bound.
In the previous example, the complexity of the algorithm with 2 nested loops can be written as O(n*n). This means the program's running time is never worse than n*n. It can be better, but it can never be worse. This gives a good idea of how fast the program runs.
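The "never worse than" idea can be seen concretely with linear search from earlier: counting comparisons shows the best case can be much cheaper than the bound, but the worst case never exceeds n. This is a hedged sketch; the function name is chosen for the example.

```python
def comparisons_in_linear_search(arr, target):
    """Count how many comparisons a linear search performs."""
    comps = 0
    for value in arr:
        comps += 1           # one comparison per element examined
        if value == target:
            break            # stop early on a match
    return comps

data = list(range(100))
best = comparisons_in_linear_search(data, 0)    # target is first: 1 comparison
worst = comparisons_in_linear_search(data, -1)  # target absent: all n = 100 comparisons
print(best, worst)  # 1 100
```

The search is O(n): some inputs finish after a single comparison, but no input ever costs more than n comparisons.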