Certainly, here's a stable sorting algorithm with O(n log n) time complexity suitable for both small
and large datasets:
Merge Sort
Divide and Conquer:
- Recursively divides the input array into two halves.
- Keeps dividing each half until single-element subarrays are reached.
Merge:
- Merges the sorted subarrays back together in a stable manner (see the Python sketch below).
- Compares the front elements of the two subarrays and places the smaller one into a temporary array.
- Ensures that elements with equal values maintain their original order.
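For concreteness, here is a minimal Python sketch of this divide-and-merge process. The names merge_sort and merge, and the optional key parameter, are illustrative choices rather than any standard library API:

```python
def merge_sort(items, key=lambda x: x):
    """Return a new, stably sorted list; `key` extracts the value to compare."""
    if len(items) <= 1:                   # base case: already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid], key)   # sort the left half
    right = merge_sort(items[mid:], key)  # sort the right half
    return merge(left, right, key)

def merge(left, right, key):
    """Merge two sorted lists, preserving the order of equal keys."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # '<=' sends ties to the left (earlier) half first,
        # which is exactly what makes the sort stable.
        if key(left[i]) <= key(right[j]):
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these still has elements
    merged.extend(right[j:])
    return merged
```

The slicing here copies sublists for clarity; an index-based version avoids those copies at the cost of a few more lines.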
Time Complexity:
- Best Case: O(n log n)
- Average Case: O(n log n)
- Worst Case: O(n log n)
- These bounds follow from the recurrence T(n) = 2T(n/2) + O(n): there are about log2(n) levels of recursion, and each level does O(n) merging work.
Space Complexity: O(n) due to the temporary array used for merging.
Stability:
- Maintains the original order of elements with equal values, as demonstrated below.
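To see that guarantee in practice, the example below reuses the merge_sort sketch above to sort made-up (name, score) records by score; records with equal scores come out in their original input order:

```python
records = [("Ava", 90), ("Ben", 85), ("Cleo", 90), ("Dan", 85)]

by_score = merge_sort(records, key=lambda record: record[1])
print(by_score)
# [('Ben', 85), ('Dan', 85), ('Ava', 90), ('Cleo', 90)]
# Ben stays ahead of Dan, and Ava ahead of Cleo, matching the input order.
```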
Why Merge Sort is Efficient:
- Consistent Performance: O(n log n) time complexity regardless of the initial order of the input data.
- Suitable for Large Datasets: Handles large arrays efficiently thanks to its divide-and-conquer structure and largely sequential memory access.
- Parallelism: The two halves can be sorted independently, so the algorithm parallelizes well on multi-core systems (see the sketch below).
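As a rough illustration of that last point, one level of parallelism can be added by sorting the two halves in separate worker processes and then merging the results. This sketch assumes the merge_sort and merge helpers above are defined at module top level (so they can be pickled), and the process start-up and copying overhead mean it only pays off for fairly large inputs:

```python
import random
from concurrent.futures import ProcessPoolExecutor

def parallel_merge_sort(items):
    """Sort the two halves in separate worker processes, then merge."""
    mid = len(items) // 2
    with ProcessPoolExecutor(max_workers=2) as pool:
        left_future = pool.submit(merge_sort, items[:mid])
        right_future = pool.submit(merge_sort, items[mid:])
        return merge(left_future.result(), right_future.result(), key=lambda x: x)

if __name__ == "__main__":
    data = [random.randint(0, 10**6) for _ in range(500_000)]
    assert parallel_merge_sort(data) == sorted(data)
```

Processes are used rather than threads because the sorting work is CPU-bound.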
Note: Other algorithms such as Heap Sort also run in O(n log n) time, but Merge Sort is stable and its largely sequential memory access is more cache-friendly, so it often performs better in practice on large datasets; the trade-off is the extra O(n) space.