Page 27 - DCAP201_FUNDAMENTALS_OF_DATA_STRUCTURES
Fundamentals of Data Structures
Notes We need to find c and n₀ such that:
3n² + 4n − 2 ≤ cn² for all n ≥ n₀.
Divide both sides by n², getting:
3 + 4/n − 2/n² ≤ c for all n ≥ n₀.
If we choose n₀ equal to 1, then we need a value of c such that:
3 + 4 − 2 ≤ c
We can set c equal to 6. Now we have:
3n² + 4n − 2 ≤ 6n² for all n ≥ 1.
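The inequality just derived can be spot-checked numerically (a minimal Python sketch; the function names f and bound are ours, and such a check is no substitute for the algebraic argument above):

```python
# Spot-check that 3n^2 + 4n - 2 <= 6n^2 for n = 1..10000.
def f(n):
    return 3 * n**2 + 4 * n - 2

def bound(n):
    return 6 * n**2

assert all(f(n) <= bound(n) for n in range(1, 10001))
print("3n^2 + 4n - 2 <= 6n^2 holds for n = 1..10000")
```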
Show n³ ≠ O(n²). Let's assume to the contrary that
n³ = O(n²)
Then there must exist constants c and n₀ such that
n³ ≤ cn² for all n ≥ n₀.
Dividing by n², we get:
n ≤ c for all n ≥ n₀.
But this is not possible; we can never choose a constant c large enough that n will never exceed it, since n can grow without bound. Thus the original assumption, that n³ = O(n²), must be wrong, and so n³ ≠ O(n²).
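The failure can also be illustrated numerically: for any proposed constant c, the inequality n³ ≤ cn² breaks down as soon as n exceeds c (a Python sketch; first_violation is a hypothetical helper name of ours):

```python
# For any candidate constant c, n^3 exceeds c * n^2 once n > c,
# showing no single c can satisfy n^3 <= c*n^2 for all large n.
def first_violation(c):
    """Smallest integer n >= 1 with n**3 > c * n**2."""
    n = 1
    while n**3 <= c * n**2:
        n += 1
    return n

for c in (10, 100, 1000):
    print(f"c = {c}: inequality first fails at n = {first_violation(c)}")
```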
Big O gives us a formal way of expressing asymptotic upper bounds, a way of bounding from
above the growth of a function.
Notes Knowing where a function falls within the big-O hierarchy allows us to compare it
quickly with other functions and gives us an idea of which algorithm has the best time
performance.
2.3.1 Growth Rate Functions
As we know, Big O notation is a convenient way of describing the growth rate of a function, and hence the time complexity of an algorithm.
Let n be the size of the input and f(n), g(n) be positive functions of n.
The time efficiency of almost all of the algorithms can be characterized by only a few growth
rate functions:
O(1) - Constant Time
This means that the algorithm requires the same fixed number of steps regardless of the size of
the task.
Example: (assuming a reasonable implementation of the task):
Push and Pop operations for a stack (containing n elements);
Insert and Remove operations for a queue.
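For instance, in Python (one reasonable implementation; the text leaves the language unspecified), a list used as a stack and a collections.deque used as a queue both give constant-time operations:

```python
from collections import deque

# Stack: push and pop at the end of a Python list are O(1) (amortized).
stack = []
stack.append(42)   # push
top = stack.pop()  # pop

# Queue: deque supports O(1) insert at one end and remove at the other.
queue = deque()
queue.append("a")        # insert (enqueue)
front = queue.popleft()  # remove (dequeue)

print(top, front)
```

Note that each operation touches only one end of the structure, which is why the cost does not depend on how many elements it holds.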
20 LOVELY PROFESSIONAL UNIVERSITY