<> Conventions

The elements to be sorted must implement Java's Comparable interface, whose compareTo() method determines the ordering of two elements.

The auxiliary functions less() and swap() are used for comparison and exchange, making the code more readable and portable.

The cost model of a sorting algorithm is the number of comparisons and exchanges it performs.
public abstract class Sort<T extends Comparable<T>> {

    public abstract void sort(T[] nums);

    protected boolean less(T v, T w) {
        return v.compareTo(w) < 0;
    }

    protected void swap(T[] a, int i, int j) {
        T t = a[i];
        a[i] = a[j];
        a[j] = t;
    }
}
<> Selection sort

Select the smallest element of the array and exchange it with the first element. Then select the smallest element among the remaining elements and exchange it with the second element. Keep doing this until the entire array is sorted.

Selection sort needs ~N²/2 comparisons and ~N exchanges. Its running time is independent of the input, which means it performs just as many comparisons and exchanges on an array that is already sorted.

public class Selection<T extends Comparable<T>> extends Sort<T> {

    @Override
    public void sort(T[] nums) {
        int N = nums.length;
        for (int i = 0; i < N - 1; i++) {
            int min = i;
            for (int j = i + 1; j < N; j++) {
                if (less(nums[j], nums[min])) {
                    min = j;
                }
            }
            swap(nums, i, min);
        }
    }
}
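As a quick usage check, the same selection logic can be sketched on a plain int array (the class and method names here are illustrative, not part of the Sort hierarchy above):

```java
import java.util.Arrays;

public class SelectionDemo {

    // Same selection logic as above, specialized to int[].
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j; // remember the smallest remaining element
            }
            int t = a[i]; a[i] = a[min]; a[min] = t; // move it into position i
        }
    }

    public static void main(String[] args) {
        int[] nums = {5, 3, 1, 4, 2};
        selectionSort(nums);
        System.out.println(Arrays.toString(nums)); // [1, 2, 3, 4, 5]
    }
}
```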
<> Bubble sort

Repeatedly exchange adjacent out-of-order elements from left to right; after one pass, the largest unsorted element has floated up to the right end.

If no exchange occurs during a pass, the array is already sorted and the algorithm can exit early.

public class Bubble<T extends Comparable<T>> extends Sort<T> {

    @Override
    public void sort(T[] nums) {
        int N = nums.length;
        boolean isSorted = false;
        for (int i = N - 1; i > 0 && !isSorted; i--) {
            isSorted = true;
            for (int j = 0; j < i; j++) {
                if (less(nums[j + 1], nums[j])) {
                    isSorted = false;
                    swap(nums, j, j + 1);
                }
            }
        }
    }
}
<> Insertion sort

Each step inserts the current element into the sorted subarray on its left, so that the left subarray remains sorted after the insertion.

For the array {3, 5, 2, 4, 1}, the inversions are (3, 2), (3, 1), (5, 2), (5, 4), (5, 1), (2, 1), (4, 1). Insertion sort only exchanges adjacent elements, and each exchange reduces the number of inversions by 1, so the number of exchanges insertion sort performs equals the number of inversions.

The time complexity of insertion sort depends on the initial order of the array: if the array is partially sorted, there are fewer inversions, fewer exchanges are required, and the time complexity is lower.

* On average, insertion sort requires ~N²/4 comparisons and ~N²/4 exchanges;
* In the worst case, it requires ~N²/2 comparisons and ~N²/2 exchanges; the worst case is an array in reverse order;
* In the best case, it requires N-1 comparisons and 0 exchanges; the best case is an already sorted array.
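The relationship between exchanges and inversions can be checked with a small sketch (names are illustrative): brute-force counting of inversions for {3, 5, 2, 4, 1} gives 7, matching the 7 exchanges an insertion sort performs on that array.

```java
public class InversionDemo {

    // Count inversions by brute force: pairs (i, j) with i < j and a[i] > a[j].
    public static int inversions(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] > a[j]) count++;
        return count;
    }

    // Insertion sort that counts its adjacent-element swaps.
    public static int insertionSortSwaps(int[] a) {
        int swaps = 0;
        for (int i = 1; i < a.length; i++) {
            for (int j = i; j > 0 && a[j] < a[j - 1]; j--) {
                int t = a[j]; a[j] = a[j - 1]; a[j - 1] = t;
                swaps++;
            }
        }
        return swaps;
    }

    public static void main(String[] args) {
        int[] a = {3, 5, 2, 4, 1};
        // Both counts agree: one swap removes exactly one inversion.
        System.out.println(inversions(a) + " " + insertionSortSwaps(a.clone())); // 7 7
    }
}
```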
public class Insertion<T extends Comparable<T>> extends Sort<T> {

    @Override
    public void sort(T[] nums) {
        int N = nums.length;
        for (int i = 1; i < N; i++) {
            for (int j = i; j > 0 && less(nums[j], nums[j - 1]); j--) {
                swap(nums, j, j - 1);
            }
        }
    }
}
<> Shell Sort

For large arrays, insertion sort is slow because it only exchanges adjacent elements and therefore removes only one inversion at a time. Shell sort was devised to overcome this limitation: it exchanges non-adjacent elements, so a single exchange can remove more than one inversion.

Shell sort uses insertion sort to sort the subsequences of elements that are h apart. By decreasing h until h = 1, the entire array becomes sorted.

public class Shell<T extends Comparable<T>> extends Sort<T> {

    @Override
    public void sort(T[] nums) {
        int N = nums.length;
        int h = 1;
        while (h < N / 3) {
            h = 3 * h + 1; // 1, 4, 13, 40, ...
        }
        while (h >= 1) {
            for (int i = h; i < N; i++) {
                for (int j = i; j >= h && less(nums[j], nums[j - h]); j -= h) {
                    swap(nums, j, j - h);
                }
            }
            h = h / 3;
        }
    }
}
Shell sort runs in less than quadratic time. With the increment sequence 1, 4, 13, 40, …, the number of comparisons does not exceed N multiplied by a small multiple of the length of the increment sequence. The more advanced sorting algorithms introduced later are only about twice as fast as Shell sort.

<> Merge sort

The idea of merge sort is to divide the array into two halves, sort each half separately, and then merge them.

<>1. The merge method

The merge method merges two sorted parts of an array into one .
public abstract class MergeSort<T extends Comparable<T>> extends Sort<T> {

    protected T[] aux;

    protected void merge(T[] nums, int l, int m, int h) {
        int i = l, j = m + 1;
        for (int k = l; k <= h; k++) {
            aux[k] = nums[k]; // copy the data to the auxiliary array
        }
        for (int k = l; k <= h; k++) {
            if (i > m) {
                nums[k] = aux[j++];
            } else if (j > h) {
                nums[k] = aux[i++];
            } else if (aux[i].compareTo(aux[j]) <= 0) {
                nums[k] = aux[i++]; // take from the left half first to ensure stability
            } else {
                nums[k] = aux[j++];
            }
        }
    }
}
<>2. Top-down merge sort

Divide a large array into two smaller arrays and solve them recursively.

Because the problem is divided into two subproblems each time, the complexity of this divide-and-conquer algorithm is O(N log N).
public class Up2DownMergeSort<T extends Comparable<T>> extends MergeSort<T> {

    @Override
    public void sort(T[] nums) {
        aux = (T[]) new Comparable[nums.length];
        sort(nums, 0, nums.length - 1);
    }

    private void sort(T[] nums, int l, int h) {
        if (h <= l) {
            return;
        }
        int mid = l + (h - l) / 2;
        sort(nums, l, mid);
        sort(nums, mid + 1, h);
        merge(nums, l, mid, h);
    }
}
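The top-down recursion can be exercised with a self-contained int sketch (illustrative names) that bundles a primitive version of merge():

```java
import java.util.Arrays;

public class MergeDemo {

    static int[] aux;

    // Standalone int version of the top-down merge sort described above.
    public static void sort(int[] a) {
        aux = new int[a.length];
        sort(a, 0, a.length - 1);
    }

    static void sort(int[] a, int l, int h) {
        if (h <= l) return;
        int mid = l + (h - l) / 2;
        sort(a, l, mid);       // sort the left half
        sort(a, mid + 1, h);   // sort the right half
        merge(a, l, mid, h);   // merge the two sorted halves
    }

    static void merge(int[] a, int l, int m, int h) {
        for (int k = l; k <= h; k++) aux[k] = a[k]; // copy to the auxiliary array
        int i = l, j = m + 1;
        for (int k = l; k <= h; k++) {
            if (i > m) a[k] = aux[j++];
            else if (j > h) a[k] = aux[i++];
            else if (aux[i] <= aux[j]) a[k] = aux[i++]; // left half first: stability
            else a[k] = aux[j++];
        }
    }

    public static void main(String[] args) {
        int[] nums = {6, 2, 8, 4, 1};
        sort(nums);
        System.out.println(Arrays.toString(nums)); // [1, 2, 4, 6, 8]
    }
}
```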
<>3. Bottom-up merge sort

First merge pairs of tiny subarrays, then merge the resulting subarrays in pairs, until the whole array is merged.
public class Down2UpMergeSort<T extends Comparable<T>> extends MergeSort<T> {

    @Override
    public void sort(T[] nums) {
        int N = nums.length;
        aux = (T[]) new Comparable[N];
        for (int sz = 1; sz < N; sz += sz) {
            for (int lo = 0; lo < N - sz; lo += sz + sz) {
                merge(nums, lo, lo + sz - 1, Math.min(lo + sz + sz - 1, N - 1));
            }
        }
    }
}
<> Quick sort

<>1. Basic algorithm

* Merge sort splits the array into two subarrays, sorts each separately, and then merges the sorted subarrays to sort the whole array;
* Quicksort splits the array into two subarrays using a partition element: the left subarray is less than or equal to the partition element and the right subarray is greater than or equal to it. Sorting these two subarrays sorts the entire array.
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class QuickSort<T extends Comparable<T>> extends Sort<T> {

    @Override
    public void sort(T[] nums) {
        shuffle(nums);
        sort(nums, 0, nums.length - 1);
    }

    // protected so that subclasses such as ThreeWayQuickSort can override it
    protected void sort(T[] nums, int l, int h) {
        if (h <= l) {
            return;
        }
        int j = partition(nums, l, h);
        sort(nums, l, j - 1);
        sort(nums, j + 1, h);
    }

    private void shuffle(T[] nums) {
        List<T> list = Arrays.asList(nums);
        Collections.shuffle(list);
        list.toArray(nums);
    }
}
<>2. Partitioning

Take a[l] as the partition element. Scan right from the left end of the array until the first element greater than or equal to it is found, and scan left from the right end until the first element smaller than it is found, then exchange these two elements. Repeating this process guarantees that no element to the left of pointer i is greater than the partition element and no element to the right of pointer j is smaller than it. When the two pointers meet, exchange the partition element a[l] with a[j].

protected int partition(T[] nums, int l, int h) {
    int i = l, j = h + 1;
    T v = nums[l];
    while (true) {
        while (less(nums[++i], v) && i != h) ;
        while (less(v, nums[--j]) && j != l) ;
        if (i >= j) {
            break;
        }
        swap(nums, i, j);
    }
    swap(nums, l, j);
    return j;
}
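A standalone int sketch of this partition step (illustrative names) shows the invariant: after the call, a[j] is in its final sorted position, with nothing greater to its left and nothing smaller to its right.

```java
import java.util.Arrays;

public class PartitionDemo {

    // int version of the partition step described above.
    public static int partition(int[] a, int l, int h) {
        int i = l, j = h + 1, v = a[l];
        while (true) {
            while (a[++i] < v && i != h) ;   // scan right for an element >= v
            while (v < a[--j] && j != l) ;   // scan left for an element <= v
            if (i >= j) break;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
        int t = a[l]; a[l] = a[j]; a[j] = t; // put the partition element in place
        return j;
    }

    public static void main(String[] args) {
        int[] a = {5, 9, 2, 7, 3};
        int j = partition(a, 0, a.length - 1);
        // a[0..j-1] <= a[j] <= a[j+1..h]; the 5 ends up in its final position
        System.out.println(j + " " + Arrays.toString(a)); // 2 [2, 3, 5, 7, 9]
    }
}
```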
<>3. Performance analysis

Quicksort is an in-place sort and needs no auxiliary array, but its recursive calls require an auxiliary stack.

The best case for quicksort is to split the array exactly in half each time, which minimizes the number of recursive calls. In this case the number of comparisons satisfies C(N) = 2C(N/2) + N, and the complexity is O(N log N).

In the worst case, the first partition starts from the smallest element, the second from the second smallest, and so on, so ~N²/2 comparisons are needed. To prevent an already-ordered array from triggering this case, quicksort randomly shuffles the array first.

<>4. Algorithm improvement

<>4.1 Switch to insertion sort

Because quicksort recurses down to small subarrays, and insertion sort performs better than quicksort on small arrays, it pays to switch to insertion sort for small subarrays.

<>4.2 Median-of-three

Ideally each partition element would be the median of the subarray, but computing the median is costly. A compromise is to sample 3 elements and use the one with the middle value as the partition element.

<>4.3 Three-way partitioning

For arrays with a large number of duplicate elements, the array can be split into three parts, corresponding to elements less than, equal to, and greater than the partition element.

For random arrays with a large number of duplicate elements, sorting can then be completed in linear time.
public class ThreeWayQuickSort<T extends Comparable<T>> extends QuickSort<T> {

    @Override
    protected void sort(T[] nums, int l, int h) {
        if (h <= l) {
            return;
        }
        int lt = l, i = l + 1, gt = h;
        T v = nums[l];
        while (i <= gt) {
            int cmp = nums[i].compareTo(v);
            if (cmp < 0) {
                swap(nums, lt++, i++);
            } else if (cmp > 0) {
                swap(nums, i, gt--);
            } else {
                i++;
            }
        }
        sort(nums, l, lt - 1);
        sort(nums, gt + 1, h);
    }
}
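A standalone int sketch (illustrative names) of the same three-way partitioning, run on an array with many duplicates:

```java
import java.util.Arrays;

public class ThreeWayDemo {

    // Three-way partition quicksort: nums[l..lt-1] < v, nums[lt..gt] == v, nums[gt+1..h] > v.
    public static void sort(int[] a, int l, int h) {
        if (h <= l) return;
        int lt = l, i = l + 1, gt = h;
        int v = a[l];
        while (i <= gt) {
            if (a[i] < v) swap(a, lt++, i++);
            else if (a[i] > v) swap(a, i, gt--);
            else i++;
        }
        sort(a, l, lt - 1);  // recurse only on elements < v
        sort(a, gt + 1, h);  // and on elements > v; duplicates of v stay put
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {2, 1, 2, 0, 2, 1, 2};
        sort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [0, 1, 1, 2, 2, 2, 2]
    }
}
```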
<>5. Quick select based on partitioning

Quicksort's partition() method returns an integer j such that a[l..j-1] is less than or equal to a[j] and a[j+1..h] is greater than or equal to a[j]; a[j] is therefore already in its final sorted position.

This property can be used to find the element of rank k (the k-th smallest element).

The algorithm is linear: assuming each partition bisects the array, the total number of comparisons is (N + N/2 + N/4 + …) until the element of rank k is found, and this sum is clearly less than 2N.
public T select(T[] nums, int k) {
    int l = 0, h = nums.length - 1;
    while (h > l) {
        int j = partition(nums, l, h);
        if (j == k) {
            return nums[k];
        } else if (j > k) {
            h = j - 1;
        } else {
            l = j + 1;
        }
    }
    return nums[k];
}
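A standalone int sketch of this selection procedure (illustrative names; the random shuffle is omitted for brevity, which only affects performance, not correctness):

```java
public class QuickSelectDemo {

    static int partition(int[] a, int l, int h) {
        int i = l, j = h + 1, v = a[l];
        while (true) {
            while (a[++i] < v && i != h) ;
            while (v < a[--j] && j != l) ;
            if (i >= j) break;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
        int t = a[l]; a[l] = a[j]; a[j] = t;
        return j;
    }

    // Returns the element that would sit at index k if the array were sorted.
    public static int select(int[] a, int k) {
        int l = 0, h = a.length - 1;
        while (h > l) {
            int j = partition(a, l, h);
            if (j == k) return a[k];
            else if (j > k) h = j - 1;  // rank k lies in the left part
            else l = j + 1;             // rank k lies in the right part
        }
        return a[k];
    }

    public static void main(String[] args) {
        int[] nums = {9, 4, 7, 1, 5};
        System.out.println(select(nums, 2)); // 5 : the third smallest element
    }
}
```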
<> Heap sort

<>1. Heap

In a heap, the value of each node is always greater than or equal to the values of its child nodes, and a heap is a complete binary tree.

A heap can be stored in an array, because a complete binary tree is easily laid out in one: the parent of the node at position k is at position k/2, and its two children are at positions 2k and 2k+1. Position 0 of the array is left unused so that these position relationships stay simple.

public class Heap<T extends Comparable<T>> {

    private T[] heap;
    private int N = 0;

    public Heap(int maxN) {
        this.heap = (T[]) new Comparable[maxN + 1];
    }

    public boolean isEmpty() {
        return N == 0;
    }

    public int size() {
        return N;
    }

    private boolean less(int i, int j) {
        return heap[i].compareTo(heap[j]) < 0;
    }

    private void swap(int i, int j) {
        T t = heap[i];
        heap[i] = heap[j];
        heap[j] = t;
    }
}
<>2. Swimming and sinking

In a heap, when a node is larger than its parent, the two nodes need to be exchanged. After the exchange, the node may still be larger than its new parent, so the compare-and-exchange operation must be repeated upward. This operation is called swimming (floating up).

private void swim(int k) {
    while (k > 1 && less(k / 2, k)) {
        swap(k / 2, k);
        k = k / 2;
    }
}
Similarly, when a node is smaller than one of its children, it must repeatedly compare and exchange downward; this operation is called sinking. If a node has two children, it should be exchanged with the larger of the two.

private void sink(int k) {
    while (2 * k <= N) {
        int j = 2 * k;
        if (j < N && less(j, j + 1))
            j++;
        if (!less(k, j))
            break;
        swap(k, j);
        k = j;
    }
}
<>3. Insert element

Place the new element at the end of the array, then swim it up to its proper position.

public void insert(T v) {
    heap[++N] = v;
    swim(N);
}
<>4. Delete maximum element

Remove the largest element from the top of the heap, move the last element of the array to the top, and then sink that element into place.
public T delMax() {
    T max = heap[1];
    swap(1, N--);
    heap[N + 1] = null;
    sink(1);
    return max;
}
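Putting insert and delMax together, a minimal standalone int max-heap (illustrative names) behaves as a priority queue: repeatedly calling delMax yields the elements in decreasing order.

```java
public class HeapDemo {

    // Minimal int max-heap with the layout described above: index 0 unused,
    // parent of k at k/2, children at 2k and 2k+1.
    private int[] heap;
    private int N = 0;

    public HeapDemo(int maxN) {
        heap = new int[maxN + 1];
    }

    private void swap(int i, int j) {
        int t = heap[i]; heap[i] = heap[j]; heap[j] = t;
    }

    private void swim(int k) {
        while (k > 1 && heap[k / 2] < heap[k]) { swap(k / 2, k); k /= 2; }
    }

    private void sink(int k) {
        while (2 * k <= N) {
            int j = 2 * k;
            if (j < N && heap[j] < heap[j + 1]) j++; // pick the larger child
            if (heap[k] >= heap[j]) break;
            swap(k, j);
            k = j;
        }
    }

    public void insert(int v) { heap[++N] = v; swim(N); }

    public int delMax() { int max = heap[1]; swap(1, N--); sink(1); return max; }

    public static void main(String[] args) {
        HeapDemo pq = new HeapDemo(10);
        for (int v : new int[]{3, 7, 1, 9, 5}) pq.insert(v);
        StringBuilder sb = new StringBuilder();
        while (pq.N > 0) sb.append(pq.delMax()).append(' ');
        System.out.println(sb.toString().trim()); // 9 7 5 3 1
    }
}
```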
<>5. Heap sort

Repeatedly exchange the largest element with the last element of the current heap, shrinking the heap instead of deleting the element. This accumulates the elements in decreasing order from the end of the array toward the front, which read from the front is an increasing sequence; this is heap sort.

<>5.1 Building the heap

The most direct way to heapify an unordered array is to traverse it from left to right, swimming each element up. A more efficient method is to sink from right to left: if both subtrees of a node are already heap-ordered, sinking that node makes the subtree rooted at it heap-ordered. Leaf nodes need no sinking, so they can be skipped, and only half of the elements need to be traversed.

<>5.2 Swap the top element with the last element

After the exchange, a sink operation is needed to restore heap order.

public class HeapSort<T extends Comparable<T>> extends Sort<T> {

    /** Position 0 of the array must not hold an element. */
    @Override
    public void sort(T[] nums) {
        int N = nums.length - 1;
        for (int k = N / 2; k >= 1; k--)
            sink(nums, k, N);
        while (N > 1) {
            swap(nums, 1, N--);
            sink(nums, 1, N);
        }
    }

    private void sink(T[] nums, int k, int N) {
        while (2 * k <= N) {
            int j = 2 * k;
            if (j < N && less(nums, j, j + 1))
                j++;
            if (!less(nums, k, j))
                break;
            swap(nums, k, j);
            k = j;
        }
    }

    private boolean less(T[] nums, int i, int j) {
        return nums[i].compareTo(nums[j]) < 0;
    }
}
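A standalone int sketch of the same heap sort (illustrative names); note that position 0 is a placeholder and only nums[1..N] participate in the sort:

```java
import java.util.Arrays;

public class HeapSortDemo {

    // Heap sort on nums[1..N]; index 0 is left unused, matching the class above.
    public static void heapSort(int[] nums) {
        int N = nums.length - 1;
        for (int k = N / 2; k >= 1; k--)
            sink(nums, k, N);          // build the heap from right to left
        while (N > 1) {
            swap(nums, 1, N--);        // move the current maximum to the end
            sink(nums, 1, N);          // restore heap order
        }
    }

    static void sink(int[] nums, int k, int N) {
        while (2 * k <= N) {
            int j = 2 * k;
            if (j < N && nums[j] < nums[j + 1]) j++;
            if (nums[k] >= nums[j]) break;
            swap(nums, k, j);
            k = j;
        }
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] nums = {0, 5, 2, 9, 1, 7}; // position 0 is a placeholder
        heapSort(nums);
        System.out.println(Arrays.toString(nums)); // [0, 1, 2, 5, 7, 9]
    }
}
```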
<>6. Analysis

The height of a heap is log N, so the complexity of inserting into a heap and of deleting the maximum element is log N.

For heap sort, N nodes must be sunk, so the complexity is N log N.

Heap sort is an in-place sort that uses no extra space.

Heap sort is rarely used on modern systems because it cannot exploit locality for caching: array elements are rarely compared or exchanged with adjacent elements.

<> Summary

<>1. Comparison of sorting algorithms

Algorithm                 Stable   Time complexity                                           Space complexity   Remarks
Selection sort            ×        N²                                                        1
Bubble sort               √        N²                                                        1
Insertion sort            √        N ~ N²                                                    1                  depends on the initial order of the array
Shell sort                ×        N × (a small multiple of the increment-sequence length)   1                  an improved insertion sort
Quick sort                ×        N log N                                                   log N
Three-way quick sort      ×        N ~ N log N                                               log N              suitable for arrays with many duplicate keys
Merge sort                √        N log N                                                   N
Heap sort                 ×        N log N                                                   1                  cannot exploit locality
Quicksort is the fastest general-purpose sorting algorithm. Its inner loop uses very few instructions, and it makes good use of the cache because it always accesses data sequentially. Its running time is approximately ~cN log N, where c is smaller than for the other linearithmic sorting algorithms.

Quicksort with three-way partitioning reaches linear time in practice for inputs with certain distributions, whereas other sorting algorithms still need linearithmic time.

<>2. Sorting in the Java standard library

Java's principal sorting method is java.util.Arrays.sort(). It uses a variant of quicksort for primitive types and merge sort for reference types.
