Sorting method  | Average case       | Best case    | Worst case   | Auxiliary space | Stability
Bubble sort     | O(n^2)             | O(n)         | O(n^2)       | O(1)            | stable
Selection sort  | O(n^2)             | O(n^2)       | O(n^2)       | O(1)            | unstable
Insertion sort  | O(n^2)             | O(n)         | O(n^2)       | O(1)            | stable
Shell sort      | O(n*log(n))~O(n^2) | O(n^1.3)     | O(n^2)       | O(1)            | unstable
Heap sort       | O(n*log(n))        | O(n*log(n))  | O(n*log(n))  | O(1)            | unstable
Merge sort      | O(n*log(n))        | O(n*log(n))  | O(n*log(n))  | O(n)            | stable
Quick sort      | O(n*log(n))        | O(n*log(n))  | O(n^2)       | O(log(n))       | unstable
With a simple optimization, bubble sort's best-case time complexity can reach O(n): set a flag bit, and if a full pass performs no swaps, the sort can end early. On already-sorted input, therefore, the time complexity is O(n) (a sketch follows below).

Selection sort, in both the best and worst cases, must scan the remaining sequence for its smallest (or largest) element and swap it into the position just after the sorted prefix, so its best- and worst-case time complexities are both O(n^2).

Insertion sort takes the next element after the sorted prefix and inserts it into the correct position within that prefix, so on already-sorted input the time complexity is O(n) (see the second sketch below).

A heap is a complete binary tree, so the tree's depth is floor(log2(n)) + 1, and the best- and worst-case time complexities are both O(n*log(n)).

Merge sort splits a large array into two halves and recurses on each, which corresponds to a binary tree of depth floor(log2(n)) + 1, so its best- and worst-case time complexities are both O(n*log(n)).

When quick sort runs on input that is already in ascending or descending order, each partition produces a subsequence only one record shorter than the previous one; drawn as a recursion tree, this degenerates into a skewed tree requiring n-1 partitions, where the i-th partition takes n-i key comparisons to place the i-th record. The total is \sum_{i=1}^{n-1}(n-i) = n(n-1)/2, i.e. O(n^2) (the last sketch below verifies this count).
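The flag-bit optimization described above can be sketched as follows; this is a minimal illustrative version, and the function name bubble_sort and the in-place swapping style are my own choices, not from the original.

```python
def bubble_sort(arr):
    """Bubble sort with an early-exit flag (illustrative sketch).

    If a full pass makes no swaps, the array is already sorted and
    we stop early, giving the O(n) best case on sorted input.
    """
    n = len(arr)
    for i in range(n - 1):
        swapped = False                  # the "flag bit" from the text
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:                  # no exchange occurred: end early
            break
    return arr
```

On sorted input the first pass performs n-1 comparisons and no swaps, so the loop exits after a single pass.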
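The insertion sort best case works the same way; below is a minimal sketch (again with an assumed function name) showing why sorted input costs only n-1 comparisons in total.

```python
def insertion_sort(arr):
    """Insertion sort sketch: each element is inserted into the
    already-sorted prefix to its left. On sorted input the while
    loop exits immediately every time, giving the O(n) best case.
    """
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:   # shift larger elements right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                 # insert into the correct slot
    return arr
```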
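Finally, the n(n-1)/2 figure for quick sort's worst case can be checked numerically. The sketch below counts key comparisons in a naive quicksort that always picks the first element as pivot, an assumption matching the degenerate case described above; the helper name and list-based partitioning are illustrative only.

```python
def quicksort_comparisons(arr):
    """Count key comparisons made by a naive quicksort (first element
    as pivot). On already-sorted input each partition peels off only
    one record, so the total is (n-1) + (n-2) + ... + 1 = n(n-1)/2.
    """
    if len(arr) <= 1:
        return 0
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # len(rest) comparisons against the pivot, plus the two recursions
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

n = 10
assert quicksort_comparisons(list(range(n))) == n * (n - 1) // 2
```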