## Quicksort algorithm in Java

The basic idea behind quicksort is this: Specify one element in the list as a “pivot” point. Then, go through all of the elements in the list, swapping items that are on the “wrong” side of the pivot. In other words, swap items that are smaller than the pivot but on the right side of the pivot with items that are larger than the pivot but on the left side of the pivot. Once you’ve done all possible swaps, move the pivot to wherever it belongs in the list. Now we can ignore the pivot, since it’s in position, and repeat the process for the two halves of the list (one on each side of the pivot). We repeat this until all of the items in the list have been sorted. Quicksort is an example of a divide-and-conquer algorithm: it sorts a list by dividing it into smaller and smaller lists, and sorting those smaller lists in turn.

**Example:**

As you can see in the example above: after pass 1, all of the elements that are less than or equal to 65 (the pivot) are to its left, and all of the elements that are greater than 65 are to its right.

Now we repeat the same process for the left half and the right half of the array. Once both halves are sorted, the whole array is sorted.

__Java implementation for Quick sort:__

The Java implementation of quicksort is a bit more complex than our previous sort algorithms. In fact, we will use two different methods in order to implement quicksort. The first method is the primary method: it sets up the algorithm and then calls the other method, which does the actual sorting.

```java
public void quicksort(int[] list, int low, int high) {
    if (low < high) {
        int mid = partition(list, low, high);
        quicksort(list, low, mid - 1);
        quicksort(list, mid + 1, high);
    }
}
```

Before we get to the partition method, let’s take a close look at the quicksort() method. quicksort() actually calls itself, twice! A method that calls itself is called a recursive method. Recursive methods are useful when we need to repeat some series of tasks a number of times, with a parameter that varies each time the calculation is done.

Recursive methods have three defining features:

• A base case or stop case, that indicates when the method should stop calling itself

• A test to determine if recursion should continue

• One or more calls to itself

In other words, each time we call the method, we change the parameters that we send to the method, test to see if we should call the method again, and provide some way to ultimately return from the method.
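As a small illustration of these three features (this example is not part of the quicksort code; the class and method names are assumptions for the sketch), here is a recursive method that adds the integers from 1 to n:

```java
public class RecursionDemo {
    // Returns 1 + 2 + ... + n.
    public static int sum(int n) {
        if (n <= 0) {              // the test; n <= 0 is the base (stop) case
            return 0;
        }
        return n + sum(n - 1);     // the call to itself, with a smaller parameter
    }

    public static void main(String[] args) {
        System.out.println(sum(5)); // prints 15
    }
}
```

Each call shrinks the parameter by one, so the test is eventually false and the chain of calls unwinds.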

In the quicksort() method, the conditional statement serves as the test. The method is called twice, once for the lower half of the list and once for the upper half of the list. The “stop case” occurs when low is greater than or equal to high: in this case, nothing is done because the condition in the conditional statement is false. The quicksort() method calls another method that selects the pivot and moves the elements in the list around as in the example. It returns the index of the pivot point in the list, which is the item in the list that is now fixed in place.

```java
public int partition(int[] list, int low, int high) {
    // The pivot point is the first item in the subarray
    int pivot = list[low];

    // Loop through the array. Move items up or down the array so that they
    // are in the proper spot with regard to the pivot point.
    do {
        // Can we find a number smaller than the pivot point? Keep moving
        // the "high" marker down the array until we find this, or until
        // high == low
        while (low < high && list[high] >= pivot) {
            high--;
        }
        if (low < high) {
            // Found a smaller number; swap it into position
            list[low] = list[high];

            // Now look for a number larger than the pivot point
            while (low < high && list[low] <= pivot) {
                low++;
            }
            if (low < high) {
                // Found one! Move it into position
                list[high] = list[low];
            }
        }
    } while (low < high);

    // Move the pivot back into the array and return its index
    list[low] = pivot;
    return low;
}
```
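Putting the two methods together, a small driver might look like the following sketch (the class name `QuickSortDemo` and the sample data are assumptions of this example, not part of the original code):

```java
import java.util.Arrays;

public class QuickSortDemo {
    public static void quicksort(int[] list, int low, int high) {
        if (low < high) {
            int mid = partition(list, low, high);
            quicksort(list, low, mid - 1);
            quicksort(list, mid + 1, high);
        }
    }

    // Hole-based partition: the pivot is held aside while smaller and larger
    // items are moved into place, then dropped back into the remaining slot.
    public static int partition(int[] list, int low, int high) {
        int pivot = list[low];
        do {
            while (low < high && list[high] >= pivot) high--;
            if (low < high) {
                list[low] = list[high];
                while (low < high && list[low] <= pivot) low++;
                if (low < high) list[high] = list[low];
            }
        } while (low < high);
        list[low] = pivot;
        return low;
    }

    public static void main(String[] args) {
        int[] data = {65, 72, 13, 90, 8, 65, 41};
        quicksort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [8, 13, 41, 65, 65, 72, 90]
    }
}
```

Note that the caller passes the full index range, `0` through `data.length - 1`, on the first call.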

__Performance of Quick sort algorithm:__

- The best case of quicksort occurs when the pivot splits the list exactly in half on every pass. (Note that for this implementation, which always picks the first element as the pivot, an already sorted list is actually the worst case, not the best.) In terms of performance, the best case closely resembles the average case.
- The average case occurs when the pivot splits the list in half or nearly in half on each pass. Each pass through the algorithm requires N comparisons. But how many passes does quicksort take? Recall that every time we divide something in half repeatedly, as in binary search, the number of operations is approximately log₂ N. So, the number of passes through the algorithm is approximately log₂ N, and thus the number of comparisons in the average and best cases is O(N log N). The number of swaps differs in the best and average cases: for the best case, we have no swaps, but for the average case, by the same reasoning, we have O(N log N) swaps.
- The worst case for quicksort occurs when the pivot is always the largest or smallest item on each pass through the list. In this instance, we do not split the list in half or nearly in half, so we do N comparisons over roughly N passes, which means that in the worst case quicksort is closer to O(N²) in performance. For the same reason, the number of swaps in the worst case can be as high as O(N²).
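One way to see the gap between the average and worst cases is to count comparisons directly. The sketch below (the class name `QuickSortCount` and the counting helpers are assumptions of this example) instruments the partition method from above. With a first-element pivot, a sorted input of N = 64 degenerates into N(N−1)/2 = 2016 comparisons, while a shuffled copy of the same data needs far fewer:

```java
import java.util.Random;

public class QuickSortCount {
    static long comparisons = 0;

    // Counting wrappers around the two list comparisons in partition()
    static boolean atLeastPivot(int[] a, int i, int pivot) { comparisons++; return a[i] >= pivot; }
    static boolean atMostPivot(int[] a, int i, int pivot)  { comparisons++; return a[i] <= pivot; }

    static void quicksort(int[] list, int low, int high) {
        if (low < high) {
            int mid = partition(list, low, high);
            quicksort(list, low, mid - 1);
            quicksort(list, mid + 1, high);
        }
    }

    static int partition(int[] list, int low, int high) {
        int pivot = list[low];
        do {
            while (low < high && atLeastPivot(list, high, pivot)) high--;
            if (low < high) {
                list[low] = list[high];
                while (low < high && atMostPivot(list, low, pivot)) low++;
                if (low < high) list[high] = list[low];
            }
        } while (low < high);
        list[low] = pivot;
        return low;
    }

    static long countFor(int[] data) {
        comparisons = 0;
        quicksort(data, 0, data.length - 1);
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 64;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;

        int[] shuffled = sorted.clone();
        Random rng = new Random(42);          // fixed seed for reproducibility
        for (int i = n - 1; i > 0; i--) {     // Fisher-Yates shuffle
            int j = rng.nextInt(i + 1);
            int tmp = shuffled[i]; shuffled[i] = shuffled[j]; shuffled[j] = tmp;
        }

        System.out.println("sorted input:   " + countFor(sorted) + " comparisons");   // 2016 = 64*63/2
        System.out.println("shuffled input: " + countFor(shuffled) + " comparisons"); // near N log N
    }
}
```

On the sorted input the pivot is always the smallest element, so each partition only strips off one item, giving the quadratic count.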

Let’s see whether quicksort is stable and in-place:

**Stability:**

Quicksort is not stable, since it exchanges non-adjacent elements.

Suppose the array is: 5 2 3 5 1

Let’s distinguish the two 5’s as 5(a) and 5(b).

So our array is:

5(a) 2 3 5(b) 1

Suppose we choose `3` as the pivot. The two 5 elements will end up after the pivot, and the `1` and the `2` before it.

After pass 1: 1 2 3 5(b) 5(a)

After the last pass, the array is:

1 2 3 5(b) 5(a)

The array is now sorted, but 5(a) came before 5(b) in the initial array and comes after it in the sorted array. The relative order of equal elements was not preserved, so quicksort is not stable.
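We can reproduce this with the partition method from above by carrying a tag along with each key (the class name `StabilityDemo` and the tag arrays are assumptions of this sketch). Note that this partition always picks the first element, here 5(a), as the pivot rather than the 3 used in the walkthrough above, but the two 5s still end up reversed:

```java
import java.util.Arrays;

public class StabilityDemo {
    // Same hole-based partition as before, but a tag rides along with each
    // key so we can watch what happens to equal elements.
    static int partition(int[] key, String[] tag, int low, int high) {
        int pivotKey = key[low];
        String pivotTag = tag[low];
        do {
            while (low < high && key[high] >= pivotKey) high--;
            if (low < high) {
                key[low] = key[high]; tag[low] = tag[high];
                while (low < high && key[low] <= pivotKey) low++;
                if (low < high) { key[high] = key[low]; tag[high] = tag[low]; }
            }
        } while (low < high);
        key[low] = pivotKey; tag[low] = pivotTag;
        return low;
    }

    static void quicksort(int[] key, String[] tag, int low, int high) {
        if (low < high) {
            int mid = partition(key, tag, low, high);
            quicksort(key, tag, low, mid - 1);
            quicksort(key, tag, mid + 1, high);
        }
    }

    public static void main(String[] args) {
        int[] key = {5, 2, 3, 5, 1};
        String[] tag = {"5(a)", "2", "3", "5(b)", "1"};
        quicksort(key, tag, 0, key.length - 1);
        System.out.println(Arrays.toString(tag)); // [1, 2, 3, 5(b), 5(a)]
    }
}
```

The keys come out sorted, but the tags show 5(b) ahead of 5(a): the long-distance moves in the partition broke the original order of the equal keys.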

**In place:**

An **in-place algorithm** is an algorithm that transforms its input using a data structure with only a small, constant amount of extra storage space. The input is usually overwritten by the output as the algorithm executes.

Quicksort can be implemented either in place or not in place.

The in-place, unstable version shown above uses only a constant amount of additional space before making any recursive call, plus a constant amount of bookkeeping for each nested recursive call (the call stack). In this sense quicksort is *in-place*: it sorts the elements within the array itself, with at most a constant number of them held outside the array at any given time.

Another, not-in-place, version of quicksort uses *O*(*n*) space for working storage and can implement a stable sort. The working storage allows the input array to be easily partitioned in a stable manner and then copied back to the input array for successive recursive calls.
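A minimal sketch of such a stable, not-in-place variant (the class name `StableQuickSort` is an assumption; a three-way split is used here so duplicates are handled cleanly):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StableQuickSort {
    // Stable, not-in-place quicksort: O(n) working storage per level.
    // Elements are gathered by a single left-to-right scan, which preserves
    // the relative order of equal elements.
    public static void quicksort(int[] list, int low, int high) {
        if (low >= high) return;
        int pivot = list[low];
        List<Integer> less = new ArrayList<>();
        List<Integer> equal = new ArrayList<>();
        List<Integer> greater = new ArrayList<>();
        for (int i = low; i <= high; i++) {
            if (list[i] < pivot) less.add(list[i]);
            else if (list[i] > pivot) greater.add(list[i]);
            else equal.add(list[i]);
        }
        int k = low;                               // copy back in stable order
        for (int v : less) list[k++] = v;
        for (int v : equal) list[k++] = v;
        for (int v : greater) list[k++] = v;
        quicksort(list, low, low + less.size() - 1);
        quicksort(list, low + less.size() + equal.size(), high);
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 3, 5, 1};
        quicksort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 5, 5]
    }
}
```

Because the `equal` group always contains at least the pivot itself, each recursive call is strictly smaller and the recursion terminates.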
