
Time Complexity in Algorithms - The Maths

The concept of time complexity in analyzing algorithms intimidates many people. In this first of two articles, I will try to explain what time complexity is and what the O (Big O) notation really means.

But before we move on to time complexity, it is important to understand how functions behave.

All functions exhibit a certain behavior as the value of n varies. For example, if f(n) = 1/n, then as n approaches infinity, f(n) gets closer and closer to zero. If f(n) = n*n, on the other hand, then as n approaches infinity, f(n) grows without bound.

Different functions grow at different rates. Two functions grow at the same rate if they are separated by only a constant multiple. For example, if f(n) = n*n and g(n) = 5*n*n, then f and g grow at the same pace, because g(n) = 5*f(n); they are only a constant multiple apart. That is, as n grows arbitrarily large, g(n) / f(n) remains constant, i.e. equal to 5.

Now consider f(n) = n * n and g(n) = n. How do f(n) and g(n) behave relative to each other as n approaches infinity?

f(n) / g(n) = (n * n) / (n), which simplifies to f(n) / g(n) = n. This means that as n increases to large values, the ratio f(n) / g(n) also increases; f(n) = n * g(n). In this case, f and g are not separated by a constant multiple; rather, the multiple grows larger and larger as n grows. If we try some values of n:

n = 10: f(10) / g(10) = 100 / 10 = 10
n = 100: f(100) / g(100) = 10000 / 100 = 100
n = 1000: f(1000) / g(1000) = 1000000 / 1000 = 1000

we can see that the ratio grows without bound.

Now consider f(n) = 2 * n * n + 1 and g(n) = n * n. What value does the ratio f(n) / g(n) approach as n gets large? The answer is 2. How? Let's see:

(2 * n * n + 1) / (n * n) = (2 * n * n) / (n * n) + 1 / (n * n) = 2 + 1 / (n * n).

So f(n) / g(n) = 2 + 1 / (n * n).

As n grows arbitrarily large, the value of 2 + 1/(n*n) gets closer and closer to 2, because 1/(n*n) approaches 0. As an example:

f(10) / g(10) = (2*10*10 + 1) / (10*10) = 201 / 100 = 2.01
f(100) / g(100) = (2*100*100 + 1) / (100*100) = 20001 / 10000 = 2.0001
f(1000) / g(1000) = (2*1000*1000 + 1) / (1000*1000) = 2000001 / 1000000 = 2.000001

So, we can say that f and g grow at the same pace: the multiple separating them settles towards the constant 2.

Let's summarize these points into the following three rules:

1. If f(n) / g(n) grows out of control, getting larger and larger as n gets large, then f is said to grow faster than g.

2. If f(n) / g(n) settles towards some constant positive value, then f is said to grow at the same pace as g.

3. If f(n) / g(n) gets closer and closer to zero, then this means that its reciprocal, g(n) / f(n), is growing out of control, so g is said to grow faster than f. (Or f is said to grow slower than g.)
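
These three rules are easy to test numerically. Below is a minimal Python sketch (my own illustration, not from the article; the helper name compare_growth is invented) that prints f(n) / g(n) for increasing n, so you can watch whether the ratio blows up, settles, or vanishes:

def compare_growth(f, g, ns=(10, 100, 1000, 10**6)):
    # Print f(n) / g(n) for increasing n. The trend reveals which rule
    # applies: blowing up -> rule 1, settling towards a positive
    # constant -> rule 2, shrinking towards zero -> rule 3.
    for n in ns:
        print(f"n = {n:>8}: f(n)/g(n) = {f(n) / g(n):.7f}")

# Rule 2: f(n) = 2*n*n + 1 vs. g(n) = n*n -- the ratio settles towards 2.
compare_growth(lambda n: 2*n*n + 1, lambda n: n*n)

# Rule 1: f(n) = n*n vs. g(n) = n -- the ratio grows without bound.
compare_growth(lambda n: n*n, lambda n: n)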
 

The Big O notation

The notation O(expression) represents the entire set of functions that grow slower than or at the same pace as expression. So, O(n^2) would represent the set of functions that grow slower than or at the same rate as n^2.

For example, if g(n) = n, then since n grows slower than n^2, g lies in the set O(n^2). Similarly, if h(n) = 2 * n * n + 1, then since h grows at the same pace as n^2, h also lies in the set O(n^2).

It can also be seen that h lies in the set O(n^3), because h grows slower than n^3.#

Therefore, we say that O(expression) represents an upper bound, since it contains all the functions that grow slower than, or at the same pace as, expression.
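
For completeness, here is the standard textbook definition of the set O(g), which the article states only informally; in LaTeX notation:

f \in O(g) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0

For h(n) = 2*n*n + 1, the witnesses c = 3 and n_0 = 1 work, since 2*n^2 + 1 <= 3*n^2 whenever n >= 1.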

Other notations

Other notations refer to different sets of functions.

o(expression) represents all the functions that grow strictly slower than expression (and not at the same pace).

For example, f(n) = n^2 + n - 1 lies in the set O(n^2), but it does not lie in o(n^2): the ratio f(n) / (n^2) approaches 1, a positive constant, rather than zero. On the other hand, g(n) = n lies in both, since n / n^2 = 1/n approaches zero.

Theta(expression) represents all the functions that grow at the same rate as expression.

Omega(expression) represents all the functions that grow at the same rate as, or faster than, expression.
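
Connecting these sets back to the three rules above (a summary I am adding for reference; the limit forms assume the ratio actually settles), in LaTeX notation:

f \in o(g) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0

f \in \Theta(g) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = c \ \text{for some constant}\ c > 0

f \in \Omega(g) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} > 0 \ \text{(possibly diverging to}\ \infty\text{)}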

In computer programs, we are interested in how long a program takes to run, and these notations give us a precise way to talk about that. When you say an algorithm runs in O(expression) time, you are saying that its running time is no worse than some constant multiple of expression.
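
As a concrete illustration (my own hypothetical example; the article itself defers algorithm analysis to the next part), here are two small Python functions whose running times land in different O sets:

def contains_duplicate(items):
    # Compares every pair of elements: roughly n*n/2 steps for n items,
    # so the running time lies in O(n^2).
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def largest(items):
    # A single pass over n elements (assumes a non-empty list): about n
    # steps, so the running time lies in O(n) -- and therefore also in
    # O(n^2), since O gives an upper bound.
    best = items[0]
    for x in items[1:]:
        if x > best:
            best = x
    return best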

Time Complexity

With the notation in place, the second article in this series will apply it to measure the running time of actual algorithms.

---------------------------------------------------------------------------------

#Assuming positive leading terms, quadratic functions always grow slower than cubic polynomials, which in turn always grow slower than fourth-degree polynomials, and so on: a polynomial of lower degree always grows slower than one of higher degree.
