
What are the stable sorting algorithms?

奋力向前
Release: 2021-09-28 18:58:53

Stable sorting algorithms include: 1. Bubble sort; 2. Insertion sort; 3. Merge sort; 4. Radix sort. Unstable sorting algorithms include: selection sort, quick sort, Shell sort (Hill sort), and heap sort.


The operating environment of this tutorial: Windows 10 system, Dell G3 computer.

Below we analyze the stability of the common sorting algorithms and give a brief reason for each.

Stability of the common sorting algorithms:

1. Bubble sort

Bubble sort moves small elements forward (equivalently, large elements backward). Comparisons are made only between adjacent elements, and swaps also happen only between those two adjacent elements. So if two adjacent elements are equal, there is no reason to swap them.

If two equal elements are not adjacent to begin with, then even after earlier pairwise swaps bring them next to each other, they still will not be swapped. The relative order of equal elements therefore never changes, so bubble sort is a stable sorting algorithm.
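
A minimal sketch in Python of the idea above (the function name bubble_sort and the (key, tag) pairs are my own, used only to make the order of equal keys visible): swaps happen only between adjacent elements and only on a strict "greater than", so equal keys are never reordered.

def bubble_sort(pairs):
    # Stable bubble sort of (key, tag) pairs; compare keys only.
    a = list(pairs)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j][0] > a[j + 1][0]:   # strict >: adjacent equal keys are never swapped
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([(5, 'first 5'), (3, ''), (5, 'second 5'), (2, '')]))
# [(2, ''), (3, ''), (5, 'first 5'), (5, 'second 5')] -- the equal 5s keep their original order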

2. Selection sort

Selection sort picks the smallest remaining element for each position: it selects the smallest element for the first position, then the second smallest among the remaining elements for the second position, and so on up to the (n-1)th position. The nth position needs no selection, because only the largest element is left. During one selection pass, if the chosen minimum lies after an element that is equal to the element currently in the target position, the swap will break the relative order of those equal elements and destroy stability.

This is a bit awkward to state, so here is an example: in the sequence 5 8 5 2 9, the first pass selects 2 as the minimum and swaps it with the first element 5. That moves the first 5 behind the second 5, destroying the relative order of the two 5s in the original sequence, so selection sort is not a stable sorting algorithm.
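
A small Python sketch of that example (selection_sort and the tags are my own): the long-distance swap in each pass is what jumps an element past an equal key.

def selection_sort(pairs):
    # Selection sort of (key, tag) pairs; the swap with a distant minimum breaks stability.
    a = list(pairs)
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            if a[j][0] < a[m][0]:
                m = j
        a[i], a[m] = a[m], a[i]   # may move a[i] past a later element with an equal key
    return a

print(selection_sort([(5, 'first 5'), (8, ''), (5, 'second 5'), (2, ''), (9, '')]))
# [(2, ''), (5, 'second 5'), (5, 'first 5'), (8, ''), (9, '')] -- the two 5s end up reversed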

3. Insertion sort

Insertion sort inserts one element at a time into an already ordered small sequence. At the start this ordered sequence contains only one element, the first one. Comparison starts from the end of the ordered sequence: the element to be inserted is compared with the largest already-sorted element. If it is larger, it is placed directly behind it; otherwise the search keeps moving forward until the position where it should be inserted is found.

If an element equal to the one being inserted is encountered, the new element is placed after that equal element.

So the order of equal elements does not change: the order they had in the original unsorted sequence is the order they have after sorting, so insertion sort is stable.
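
A minimal Python sketch (insertion_sort and the tags are my own): the backward scan shifts elements only while they are strictly greater, so the new element stops right after any equal key.

def insertion_sort(pairs):
    # Stable insertion sort of (key, tag) pairs.
    a = list(pairs)
    for i in range(1, len(a)):
        cur = a[i]
        j = i - 1
        while j >= 0 and a[j][0] > cur[0]:   # stop at an equal key: insert after it
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = cur
    return a

print(insertion_sort([(5, 'first 5'), (2, ''), (5, 'second 5')]))
# [(2, ''), (5, 'first 5'), (5, 'second 5')] -- equal keys keep their original order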

4. Quick sort

Quick sort scans in two directions. The left index i moves to the right while a[i] <= a[center_index], where center_index is the subscript of the pivot element, usually taken as element 0 of the array. The right index j moves to the left while a[j] > a[center_index].

When both i and j can no longer move and i <= j, swap a[i] and a[j], then repeat the process until i > j. Finally, swap a[j] and a[center_index] to finish one partition pass. When the pivot is swapped with a[j], the relative order of earlier equal elements can easily be disrupted. For example, in the sequence 5 3 3 4 3 8 9 10 11, swapping the pivot 5 with the 3 in position 5 (counting subscripts from 1) changes the relative order of the equal 3s, so quick sort is an unstable sorting algorithm. The instability arises exactly at the moment the pivot and a[j] are exchanged.
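
A Python sketch of the partition scheme described above (partition, quick_sort and the tags are my own; this is one common first-element-pivot variant, not the only way to write quick sort). Running one partition pass on the article's sequence shows the third 3 jumping in front of the other two at the moment the pivot is swapped with a[j].

def partition(a, lo, hi):
    pivot = a[lo]                              # pivot is the first element of the range
    i, j = lo + 1, hi
    while True:
        while i <= j and a[i][0] <= pivot[0]:  # i walks right while a[i] <= pivot
            i += 1
        while i <= j and a[j][0] > pivot[0]:   # j walks left while a[j] > pivot
            j -= 1
        if i > j:
            break
        a[i], a[j] = a[j], a[i]
    a[lo], a[j] = a[j], a[lo]                  # this long swap is where stability breaks
    return j

def quick_sort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)
        quick_sort(a, p + 1, hi)
    return a

data = [(5, ''), (3, 'a'), (3, 'b'), (4, ''), (3, 'c'), (8, ''), (9, ''), (10, ''), (11, '')]
partition(data, 0, len(data) - 1)
print(data[:5])
# [(3, 'c'), (3, 'a'), (3, 'b'), (4, ''), (5, '')] -- the third 3 is now ahead of the first two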

5. Merge sort

Merge sort recursively splits the sequence into short sequences. The recursion stops when a short sequence has only 1 element (already ordered) or 2 elements (one comparison and possibly one swap). The ordered segments are then merged into longer ordered sequences, and merging continues until the whole original sequence is sorted. With 1 or 2 elements, a single element is never swapped, and two equal elements are not swapped either, so this stage does not destroy stability.

So, is stability destroyed in the process of merging the short ordered sequences?

No. During merging, when the two current elements are equal, we can always take the element from the earlier sequence and place it first in the result, which preserves stability. Therefore, merge sort is also a stable sorting algorithm.
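
A minimal Python sketch (merge_sort and the tags are my own): on equal keys the merge always takes from the left (earlier) run first, which is exactly the rule that preserves stability.

def merge_sort(pairs):
    # Stable merge sort of (key, tag) pairs.
    if len(pairs) <= 1:
        return list(pairs)
    mid = len(pairs) // 2
    left = merge_sort(pairs[:mid])
    right = merge_sort(pairs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i][0] <= right[j][0]:   # "<=" keeps the element from the earlier run in front
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([(5, 'first 5'), (1, ''), (5, 'second 5')]))
# [(1, ''), (5, 'first 5'), (5, 'second 5')]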

6. Radix sort

Radix sort first sorts by the lowest digit and then collects the elements;

then it sorts by the next higher digit and collects again;

and so on, up to the highest digit. Sometimes the keys have a priority order: items are sorted by the lower-priority key first and then by the higher-priority key. The final order is determined first by the higher-priority key, and items with the same higher-priority key are ordered by the lower-priority key. Because radix sort is built from such separate stable sorting and collecting passes, it is a stable sorting algorithm.
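
A small Python sketch of LSD (least-significant-digit-first) radix sort (radix_sort, the digits parameter and the tags are my own; it assumes non-negative integer keys): each pass distributes items into buckets in their current order and collects them in bucket order, so every pass is stable and the whole sort stays stable.

def radix_sort(pairs, digits=3):
    # LSD radix sort of (key, tag) pairs with non-negative integer keys below 10**digits.
    a = list(pairs)
    base = 10
    for d in range(digits):                    # lowest digit first
        buckets = [[] for _ in range(base)]
        for item in a:                         # distribute, scanning in the current order
            buckets[(item[0] // base ** d) % base].append(item)
        a = [item for bucket in buckets for item in bucket]   # collect in bucket order
    return a

print(radix_sort([(52, 'first 52'), (7, ''), (52, 'second 52'), (13, '')]))
# [(7, ''), (13, ''), (52, 'first 52'), (52, 'second 52')]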

7. Shell sort (Hill sort)

Shell sort performs insertion sort on elements taken at different gap (step) lengths. At the beginning, when the elements are very disordered, the gap is largest, so each insertion sort involves only a few elements and runs very fast.

When the elements are nearly ordered and the gap is small, insertion sort on a nearly ordered sequence is very efficient, so the time complexity of Shell sort is better than O(n^2). Shell sort consists of multiple insertion sorts; a single insertion sort is stable and does not change the relative order of equal elements. However, across the different gap-based insertion sorts, equal elements may move within their own subsequences, so their relative order can end up scrambled. Shell sort is therefore unstable.
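
A minimal Python sketch (shell_sort, the halving gap sequence and the tags are my own): with gaps 2 and then 1, the second 2 is pulled forward within its own gap-2 subsequence and ends up ahead of the first 2, even though each individual insertion sort is stable.

def shell_sort(pairs):
    # Gap-based insertion sorts; equal keys in different subsequences can be reordered.
    a = list(pairs)
    n = len(a)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):
            cur = a[i]
            j = i
            while j >= gap and a[j - gap][0] > cur[0]:
                a[j] = a[j - gap]
                j -= gap
            a[j] = cur
        gap //= 2
    return a

print(shell_sort([(1, ''), (3, ''), (2, 'first 2'), (2, 'second 2')]))
# [(1, ''), (2, 'second 2'), (2, 'first 2'), (3, '')] -- the equal 2s are reversed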

8. Heap sort

We know that in a heap the children of node i are nodes 2*i and 2*i+1 (with 1-based indexing). A max-heap requires every parent to be greater than or equal to its two children, and a min-heap requires every parent to be less than or equal to its two children. For a sequence of length n, heap sort starts at node n/2 and selects the largest (max-heap) or smallest (min-heap) among that node and its children, 3 elements in total; choosing among these 3 elements cannot destroy stability. But when the selection continues for nodes n/2-1, n/2-2, ..., 1, stability can be destroyed: it is possible that the parent at n/2 swaps with a later element while the parent at n/2-1 does not swap with an equal later element, and then the relative order of those two equal elements is broken. Therefore, heap sort is not a stable sorting algorithm.
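
A Python sketch of heap sort (heap_sort, sift_down and the tags are my own; the code uses 0-based indexing, so the children of node i are 2*i+1 and 2*i+2 rather than the 1-based 2*i and 2*i+1 in the text): the swap of the heap root with the last element is a long-distance swap, and even on a tiny input it reorders the two equal keys.

def heap_sort(pairs):
    # Max-heap sort of (key, tag) pairs.
    a = list(pairs)
    n = len(a)

    def sift_down(start, end):
        # Push a[start] down until neither child is larger (children: 2*i+1, 2*i+2).
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child + 1][0] > a[child][0]:
                child += 1                     # pick the larger child
            if a[root][0] < a[child][0]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                break

    for start in range(n // 2 - 1, -1, -1):    # build the max-heap
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]            # long-distance swap: this breaks stability
        sift_down(0, end - 1)
    return a

print(heap_sort([(2, 'first 2'), (2, 'second 2'), (1, '')]))
# [(1, ''), (2, 'second 2'), (2, 'first 2')] -- the equal 2s have swapped order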


The above is the detailed content of What are the stable sorting algorithms?. For more information, please follow other related articles on the PHP Chinese website!
