Divide and conquer quick sort

In this “DIVIDE AND CONQUER QUICK SORT” article, you can learn the Quicksort algorithm using the Divide and Conquer technique. It covers a QuickSort introduction, an example, the algorithm, pseudo code, C and Python programs, another illustration, and finally the complexity analysis.

DIVIDE AND CONQUER QUICK SORT

Quick Sort is also based on the Divide and Conquer technique, just like merge sort. But in quicksort, all the heavy lifting (the major work) is done while dividing the array into subarrays, whereas in merge sort all the real work happens while merging the subarrays. In quicksort, the combining step does absolutely nothing.
It is also called partition-exchange sort. This algorithm divides the list into three main parts:

  1. Elements less than the pivot element
  2. Pivot element (the central element)
  3. Elements greater than the pivot element

According to the Wikipedia page on Quick sort, it qualifies as an in-place algorithm, since it only swaps elements within the input data structure.

Quick Sort is an unstable algorithm because elements are swapped according to the pivot’s position; a long-range swap during partitioning can move a later copy of a key in front of an earlier equal copy, changing the relative order of equal elements.

Although quicksort was published back in 1961, it is still a commonly used sorting algorithm. When implemented well, it can be about two or three times faster than its main competitors, merge sort and heap sort.

Complexity chart

QUICK SORT EXAMPLE

Quick Sort Example

From the given example, pick an element from the array, called a pivot, say 2. Partitioning: reorder the array so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position; this is called the partition operation. After splitting around the pivot, we get [-2, 1, 0, -4] and [7, 4, 6, 5], with the pivot placed in the middle. This is in-place sorting, so no additional space is required for the split, which is why many illustrations show the whole process on a single array. Finally, recursively apply the above steps to the sub-array of elements with smaller values and, separately, to the sub-array of elements with greater values.
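
As a quick check of the split above, here is a small C sketch (not from the original article) that classifies the example elements into the three parts relative to the pivot 2; the initial ordering {2, -2, 1, 0, -4, 7, 4, 6, 5} is an assumption, since only the element values are given in the text.

#include <stdio.h>

int main(void) {
    /* Assumed ordering of the example array */
    int a[] = {2, -2, 1, 0, -4, 7, 4, 6, 5};
    int n = (int)(sizeof a / sizeof a[0]);
    int pivot = 2;

    printf("Less than pivot   : ");
    for (int i = 0; i < n; i++)
        if (a[i] < pivot) printf("%d ", a[i]);

    printf("\nPivot             : %d", pivot);

    printf("\nGreater than pivot: ");
    for (int i = 0; i < n; i++)
        if (a[i] > pivot) printf("%d ", a[i]);

    printf("\n");
    return 0;
}

This prints -2 1 0 -4 as the left part and 7 4 6 5 as the right part, matching the example.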

QUICK SORT ALGORITHM

Step 1 − Choose the value at the highest index as the pivot
Step 2 − Take two variables, left and right, pointing into the list, excluding the pivot
Step 3 − left points to the low index
Step 4 − right points to the high index
Step 5 − While the value at left is less than the pivot, move left towards the right
Step 6 − While the value at right is greater than the pivot, move right towards the left
Step 7 − If neither step 5 nor step 6 applies, swap the values at left and right
Step 8 − If left ≥ right, the point where they meet is the pivot’s new (final) position

QUICK SORT PSEUDO CODE
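
Below is a minimal pseudo code sketch of the steps above, assuming the highest-index element is chosen as the pivot; the procedure names QUICKSORT and PARTITION are illustrative.

procedure QUICKSORT(A, low, high)
    if low < high then
        p ← PARTITION(A, low, high)
        QUICKSORT(A, low, p - 1)        // quick sort the left part
        QUICKSORT(A, p + 1, high)       // quick sort the right part
    end if
end procedure

procedure PARTITION(A, low, high)
    pivot ← A[high]                     // highest index value as pivot (Step 1)
    left  ← low                         // Step 3
    right ← high - 1                    // Step 4
    loop
        while left ≤ right and A[left] < pivot do left ← left + 1      // Step 5
        while left ≤ right and A[right] > pivot do right ← right - 1   // Step 6
        if left ≥ right then exit loop                                 // Step 8
        swap A[left] and A[right]                                      // Step 7
        left  ← left + 1
        right ← right - 1
    end loop
    swap A[left] and A[high]            // the pivot reaches its final position
    return left
end procedure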

QUICK SORT C PROGRAM:
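
The following is a minimal, self-contained C sketch of the same scheme (last element as pivot, two scanning indices); the example array in main and its ordering are assumptions.

#include <stdio.h>

static void swap(int *x, int *y) {
    int t = *x;
    *x = *y;
    *y = t;
}

/* Partition a[low..high] around the last element (Step 1)
   and return the pivot's final index. */
static int partition(int a[], int low, int high) {
    int pivot = a[high];
    int left = low;            /* scans from the low index  (Steps 2-3) */
    int right = high - 1;      /* scans from the high index (Step 4)    */

    for (;;) {
        while (left <= right && a[left] < pivot)  left++;    /* Step 5 */
        while (left <= right && a[right] > pivot) right--;   /* Step 6 */
        if (left >= right)                                   /* Step 8 */
            break;
        swap(&a[left], &a[right]);                           /* Step 7 */
        left++;
        right--;
    }
    swap(&a[left], &a[high]);  /* put the pivot in its final position */
    return left;
}

static void quicksort(int a[], int low, int high) {
    if (low < high) {
        int p = partition(a, low, high);
        quicksort(a, low, p - 1);   /* quick sort the left part  */
        quicksort(a, p + 1, high);  /* quick sort the right part */
    }
}

static void print_array(const int a[], int n) {
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
}

int main(void) {
    int a[] = {2, -2, 1, 0, -4, 7, 4, 6, 5};   /* assumed example input */
    int n = (int)(sizeof a / sizeof a[0]);

    printf("Input : ");
    print_array(a, n);
    quicksort(a, 0, n - 1);
    printf("Output: ");
    print_array(a, n);
    return 0;
}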

QUICK SORT PYTHON PROGRAM:
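
A corresponding minimal Python sketch of the same scheme; again, the example list is an assumption.

def partition(a, low, high):
    """Partition a[low..high] around the last element; return the pivot's index."""
    pivot = a[high]
    left, right = low, high - 1
    while True:
        while left <= right and a[left] < pivot:    # Step 5
            left += 1
        while left <= right and a[right] > pivot:   # Step 6
            right -= 1
        if left >= right:                           # Step 8
            break
        a[left], a[right] = a[right], a[left]       # Step 7
        left += 1
        right -= 1
    a[left], a[high] = a[high], a[left]             # pivot to its final position
    return left


def quicksort(a, low=0, high=None):
    if high is None:
        high = len(a) - 1
    if low < high:
        p = partition(a, low, high)
        quicksort(a, low, p - 1)     # quick sort the left part
        quicksort(a, p + 1, high)    # quick sort the right part


if __name__ == "__main__":
    data = [2, -2, 1, 0, -4, 7, 4, 6, 5]   # assumed example input
    print("Input :", *data)
    quicksort(data)
    print("Output:", *data)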

INPUT-OUTPUT
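
Assuming the example array used in the sketches above, a run would produce:

Input : 2 -2 1 0 -4 7 4 6 5
Output: -4 -2 0 1 2 4 5 6 7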

Short illustration of Quick Sort:

Partition structure

In this figure, 70 is first taken as the pivot element. Then {10, 30, 40, 50}, which are less than 70, form the left array, and {80, 90}, which are greater than 70, form the right array.

We recursively perform three steps:

1. Bring the pivot to its appropriate position such that the left of the pivot is smaller and the right is greater.

2. Quick sort the left part.

3. Quick sort the right part.

During each pass, the pivot value itself is not moved while the scan is in progress; instead, the array values at the i-th and j-th positions are exchanged. In other words, all lesser values end up in the first half, then the pivot value, and then all greater values in the right half.

After that, the pivot value is placed at its correct position, but neither the left half nor the right half is sorted yet. To sort them (since quicksort is a recursive function), we call the partition function again on the left half, taking 50 as the pivot. As 50 is already at its correct position, we call the quicksort function again on its left part, where 40 is taken as the pivot element. Similarly, 30 and 10 also become pivots, and the left array is sorted.

Coming to the right part, 80 is taken as the pivot. The pivot value 80 is then placed in its correct position, and the array is fully sorted.

COMPLEXITY OF QUICK SORT

BEST CASE ANALYSIS

Each time the array is divided into two halves, so the depth of the recursion is log n. At each level of the recursion, all the partitions at that level together do work (placing each pivot at its correct index) that is linear in n, i.e. O(log n) * O(n) = O(n log n). Hence, in the best case, Quicksort has time complexity O(n log n).
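
Written as a recurrence (a standard formulation, with c standing for the constant factor of the partition work), the best case is:

T(n) = 2*T(n/2) + c*n     (two equal halves plus linear partition work)
     = O(n log n)         (c*n work at each of the log n levels)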

WORST CASE ANALYSIS

In the worst case, partitioning always divides the size n array into these three parts:

  • A part of length one, containing the pivot itself,
  • A part of length zero, and
  • A part of length n-1, containing everything else.

We do not recur on the zero-length part, so Quicksort calls itself recursively n-1 times. Approximating n-1 as n, the recursion may be n levels deep (for an array of size n). At each level of the recursion, the partition at that level does work (placing the pivot at its correct index) that is linear in n, i.e. O(n) * O(n) = O(n^2). So the worst case for Quicksort is O(n^2).
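
The corresponding worst-case recurrence (again with c as the constant factor of the partition work) is:

T(n) = T(n-1) + c*n
     = c*(n + (n-1) + ... + 1)
     = c*n*(n+1)/2
     = O(n^2)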

The worst case occurs in two common situations (when the first or last element is always chosen as the pivot):

  • When the array is already sorted.
  • When the array is inversely sorted (sorted in the opposite order).

In the best case (the pivot lands at the middle position) and the average case (the pivot at a random position), the time complexity of Quick sort is O(n log n). But in the worst case (the pivot always at the end/beginning), it takes O(n^2).
