When qsort is activated the first time, key = 30, i = 2, and j = 7. i is incremented till it becomes 3, because at position 3 the value is greater than the key; j is not decremented, because at position 7 the value is less than the key. Since i < j, we interchange the 3rd and 7th elements. Then i is incremented till it becomes 6, and j is decremented till it becomes 5. Since i > j, we interchange the key element, that is, the element at position 1, with the element at position 5, and call qsort recursively with the left sub-list made of the elements at positions 1 to 4 and the right sub-list made of the elements at positions 6 to 7, as shown below:
Indices:    1   2   3   4   5   6   7
Elements:  27  20  05  10  30  70  50
           (positions 1 to 4: left sub-list; position 5: key; positions 6 to 7: right sub-list)
By continuing in this fashion, we finally get the sorted list.
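The partitioning scheme described above can be sketched in C as shown next. This is only an illustrative sketch, not the text's own listing: the procedure is named quick_sort here (to avoid the C library name qsort), 0-based C array indices are used (the text's positions 1 to 7 correspond to indices 0 to 6), and the driver in main is our own addition.

#include <stdio.h>

/* Sort a[low..high] in place.  The first element of the sub-list is taken as the key. */
void quick_sort(int a[], int low, int high)
{
    if (low >= high)
        return;

    int key = a[low];
    int i = low + 1;
    int j = high;

    for (;;) {
        while (i <= high && a[i] < key)   /* move i right while elements are smaller than key */
            i++;
        while (a[j] > key)                /* move j left while elements are larger than key   */
            j--;
        if (i >= j)                       /* pointers have crossed: partitioning is complete  */
            break;
        int t = a[i]; a[i] = a[j]; a[j] = t;   /* interchange the out-of-place pair */
        i++;
        j--;
    }

    /* place the key at its final position j, then sort the left and right sub-lists */
    int t = a[low]; a[low] = a[j]; a[j] = t;
    quick_sort(a, low, j - 1);
    quick_sort(a, j + 1, high);
}

int main(void)
{
    /* the list used in the walkthrough: positions 1..7 become indices 0..6 */
    int a[] = {30, 20, 50, 10, 27, 70, 5};
    int n = sizeof a / sizeof a[0];

    quick_sort(a, 0, n - 1);

    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);              /* prints: 5 10 20 27 30 50 70 */
    printf("\n");
    return 0;
}

Running this on the walkthrough list reproduces the state shown above after the first partitioning step (27 20 05 10 30 70 50) and then sorts the two sub-lists recursively.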
The average case time complexity of the quick sort algorithm can be determined as follows:
We assume that every time the list gets split into two approximately equal-sized sub-lists. If the size of a given list is n, then it gets split into two sub-lists of size approximately n/2. Each of these sub-lists is further split into two sub-lists of size n/4, and this continues until the size becomes 1. When quick sort works with a list of size n, it places the key element (which we take to be the first element of the list under consideration) at its proper position in the list. This requires no more than n iterations. After placing the key element at its proper position in the list of size n, quick sort activates itself two times to work with the left and right sub-lists, each assumed to be of size n/2. Let T(n) be the time required to sort a list of size n. Since the time required to sort the list of size n is equal to the sum of the time required to place the key element at its proper position in the list of size n and the time required to sort the left and right sub-lists, each assumed to be of size n/2, T(n) comes out to be:
T(n) = c*n + 2*T(n/2)
Where c is a constant and T(n/2) is the time required to sort the list of size n/2.
Similarly, the time required to sort the list of size n/2 is equal to the sum of the time required to place the key element at its proper position in the list of size n/2 and the time required to sort the left and right sub-lists, each assumed to be of size n/4. T(n/2) comes out to be:
T(n/2) = c*n/2 + 2*T(n/4)
Where T(n/4) is the time required to sort the list of size n/4.
T(n/4) = c*n/4 + 2*T(n/8), and so on, until finally we get T(1) = 1. Substituting back:
T(n) = c*n + 2*(c*(n/2) + 2*T(n/4))
T(n) = c*n + c*n + 4*T(n/4) = 2*c*n + 4*T(n/4) = 2*c*n + 4*(c*(n/4) + 2*T(n/8))
T(n) = 2*c*n + c*n + 8*T(n/8) = 3*c*n + 8*T(n/8)
Continuing in this way, T(n) = (log n)*c*n + n*T(n/n) = (log n)*c*n + n*T(1) = n + n*(log n)*c
Hence T(n) is proportional to n*log(n).
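To make the pattern behind the last step explicit, the unrolling can be stated for a general number of substitutions k; the closed form below assumes, as the text does, that n is a power of 2 so that each halving is exact:

\begin{aligned}
T(n) &= k \cdot c \cdot n + 2^{k}\, T\!\left(n/2^{k}\right) \\
     &= (\log_2 n)\, c\, n + n\, T(1) \qquad \text{choosing } k = \log_2 n \text{ so that } 2^{k} = n \\
     &= c\, n \log_2 n + n .
\end{aligned}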
Therefore we conclude that the average case time complexity of the quick sort algorithm is O(n log n).
But the worst case time complexity is O(n²).
The reason for this is that in the worst case one of the two sub-lists is always empty, and the other is of size (n-1), where n is the size of the original list. Therefore in the worst case