The Quick Sort Algorithm
This sort was invented by Charles Antony Richard (aka Tony) Hoare; it first appeared in print in 1961 (as Algorithm 64 in the Communications of the ACM) and was described in detail in a 1962 paper in The Computer Journal.
The idea of this sorting algorithm is to choose one of the values in the sequence (called the pivot) and then separate the remaining values into two groups: those less than the pivot are placed "below" it (or on its "left", depending on the point of view), while those greater than or equal to the pivot are placed "above" it (or on its "right"). The same action is then repeated (recursively) on each of the two groups.
Name: SortByQuick
Input: a sequence s of values, together with two indices, firstPosition and lastPosition, bounding the part of s that is to be sorted
Output: the sequence s, with the values in positions firstPosition through lastPosition rearranged into nondecreasing order
Algorithm SortByQuick(s, firstPosition, lastPosition)
=====================================================
if (firstPosition < lastPosition)
    Partition(s, firstPosition, lastPosition, pivotPosition)
    SortByQuick(s, firstPosition, pivotPosition-1)
    SortByQuick(s, pivotPosition+1, lastPosition)
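A minimal C++ sketch of this recursive driver, under the assumption that a partitioning helper like the one sketched after the Partition algorithm below is available; the names sortByQuick and partition are illustrative, not part of the original notes.

    #include <vector>

    // Partitioning helper (sketched below, after the Partition algorithm):
    // it rearranges s[first..last] around a pivot and returns the pivot's
    // final index, playing the role of the out-parameter pivotPosition.
    int partition(std::vector<int>& s, int first, int last);

    // Sort the subrange s[first..last] (inclusive bounds, as in the pseudocode).
    void sortByQuick(std::vector<int>& s, int first, int last)
    {
        if (first < last) {
            int pivotPosition = partition(s, first, last);
            sortByQuick(s, first, pivotPosition - 1);   // sort the values < pivot
            sortByQuick(s, pivotPosition + 1, last);    // sort the values >= pivot
        }
    }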
See here for an example (a pdf document).
Algorithm Partition(s, firstPosition, lastPosition, pivotPosition)
==================================================================
Choose the middle position in [firstPosition, lastPosition] to be the pivotPosition
Exchange the value in the first position with the value in the pivotPosition (so the pivot is now in the first position)
Let an index called lastSmall point at the new pivotPosition (the first position)
for each value from the second position onward
    if the value in that position is < the pivot (the value now in the first position)
        Increment lastSmall and exchange the value in that position with the one at lastSmall
Exchange the pivot (the value in the first position) with the value at the lastSmall position
Return lastSmall as the new pivotPosition (as a reference parameter if this is a void function)
Note that pivotPosition is an out-parameter, not an in-parameter, and when pivotPosition is sent back, all values to its left (within the range from firstPosition to lastPosition) are < the pivot, while all values to its right (within that range) are >= the pivot.
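Continuing the sketch above, here is one hedged C++ rendering of this partitioning step (the new pivot position is returned from the function rather than sent back through a reference parameter), followed by a small usage example; again the names are illustrative rather than prescribed by these notes.

    #include <iostream>
    #include <utility>   // std::swap
    #include <vector>

    // Partition s[first..last] around the value initially in the middle position.
    // On return the pivot sits at the returned index; within the subrange, every
    // value to its left is < the pivot and every value to its right is >= it.
    int partition(std::vector<int>& s, int first, int last)
    {
        int pivotPosition = first + (last - first) / 2;  // choose the middle position
        std::swap(s[first], s[pivotPosition]);           // park the pivot at the front
        const int pivot = s[first];
        int lastSmall = first;                           // last index known to hold a value < pivot

        for (int i = first + 1; i <= last; ++i) {
            if (s[i] < pivot) {
                ++lastSmall;
                std::swap(s[i], s[lastSmall]);
            }
        }
        std::swap(s[first], s[lastSmall]);               // move the pivot to its final place
        return lastSmall;                                // the new pivotPosition
    }

    // Declaration of the driver from the sketch above.
    void sortByQuick(std::vector<int>& s, int first, int last);

    int main()
    {
        std::vector<int> s{5, 2, 9, 1, 7, 3};
        sortByQuick(s, 0, static_cast<int>(s.size()) - 1);
        for (int v : s) std::cout << v << ' ';           // prints: 1 2 3 5 7 9
        std::cout << '\n';
    }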
We give an informal argument (rather than a formal proof) to try to convince you that the running time of the Quick Sort is, when the splits are reasonably balanced, O(n*log n).
First, we note that each time the algorithm is called, two things happen:
1. The Partition step looks at every value in the current group once, performing at most one comparison and one possible exchange per value, i.e. roughly twice the size of the group in operations.
2. The group is split around the pivot into two smaller groups, each of which is sorted by a recursive call; if the pivot lands near the middle each time, the groups are roughly halved, so only about log n levels of splitting are needed before every group shrinks to a single value.
The fact that each of the approximately log n levels of splitting requires as many as 2n additional operations (for the comparisons and possible exchanges) means that as many as 2n*(log n) operations are required in all, from which we may reasonably infer that the Quick Sort is O(n*log n). However, if the choice of pivot is consistently bad (so that each split puts one element in one group and all of the remaining elements in the other), then this "quick" sort degenerates to quadratic, i.e. O(n^2), performance.
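The same argument can be stated as recurrences; here is a brief sketch (in LaTeX notation, with c standing for the per-element cost of one partitioning pass, a symbol not used elsewhere in these notes).

    % Balanced splits: one partitioning pass plus two half-size subproblems.
    T(n) \approx 2\,T(n/2) + c\,n \quad\Longrightarrow\quad T(n) = O(n \log n)

    % Degenerate splits: one partitioning pass plus a subproblem of size n-1.
    T(n) \approx T(n-1) + c\,n \quad\Longrightarrow\quad
    T(n) = c\,n + c\,(n-1) + \cdots + c = O(n^2)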