Algorithms and Data Structures: Counting Sort and Radix Sort
O(n + m).
Note: This does not contradict Theorem 3 from Lecture 7 – that’s a result
about the general case, where keys have an arbitrary size (and need not
even be numeric).
Note: Counting-Sort is STABLE.
▶ (After sorting, 2 items with the same key retain their initial relative
order).
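A minimal sketch of stable Counting-Sort in Python, assuming keys are integers in the range 0..m−1 and each item is a (key, payload) pair (the function name and pair representation are illustrative, not from the lecture):

```python
# Stable Counting-Sort sketch: keys are ints in 0..m-1,
# items are (key, payload) pairs.
def counting_sort(items, m):
    # count[k] = number of items with key k           -- O(n + m)
    count = [0] * m
    for key, _ in items:
        count[key] += 1
    # prefix sums: start[k] = first output slot for key k
    start = [0] * m
    for k in range(1, m):
        start[k] = start[k - 1] + count[k - 1]
    # place items left to right; items with equal keys keep
    # their original relative order, so the sort is STABLE
    out = [None] * len(items)
    for key, payload in items:
        out[start[key]] = (key, payload)
        start[key] += 1
    return out
```

For example, sorting [(1, 'a'), (0, 'b'), (1, 'c')] with m = 2 keeps 'a' before 'c', since both have key 1 and 'a' appeared first.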
Examples
▶ Counting-Sort, Merge-Sort, and Insertion-Sort are all
stable.
▶ Quicksort is not stable.
▶ If keys and elements are exactly the same thing (in our setting, an
element is a structure containing the key as a sub-element), then we
have a much simpler (non-stable) version of Counting-Sort.
(How? ... CLASS?).
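One possible answer, sketched in Python (the function name is illustrative): when elements are their keys, equal elements are indistinguishable, so we can simply count occurrences and rewrite the array, with no need for stable placement.

```python
# Simplified (non-stable) Counting-Sort when elements ARE their keys:
# count occurrences of each key, then emit that many copies.
def counting_sort_keys_only(a, m):
    count = [0] * m
    for x in a:               # O(n): tally each key
        count[x] += 1
    out = []
    for k in range(m):        # O(n + m): emit count[k] copies of key k
        out.extend([k] * count[k])
    return out
```

This still runs in O(n + m), but avoids the prefix-sum and placement passes entirely.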
Each of the three sorting passes is carried out with respect to the digits in
that column. Stability (together with the fact that the digits/suffixes to the
right have already been sorted) means each pass achieves a sorting of the
suffixes starting at the current column.
ADS: lect 9 – slide 9 – 8th February, 2016
Radix Sort (cont’d)
Algorithm Radix-Sort(A, d)
1. for i ← 0 to d − 1 do
2.    use stable sort to sort array A using digit i as key
Most commonly, Counting Sort is used in line 2. This means that once
a set of digits is already in sorted order, then (by stability) performing
Counting Sort on the next-most-significant digit preserves that order
within the “blocks” constructed by the new iteration.
Then each execution of line 2 requires time Θ(n + R).
Thus the overall time required by Radix-Sort is
Θ(d(n + R))
For b-bit keys, choosing radix R = n (so that d = ⌈b/ lg n⌉) gives overall time
Θ(n⌈b/ lg n⌉)
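The algorithm above can be sketched in Python, with the stable Counting-Sort of each pass written inline as the “stable sort” of line 2 (the function name and the default radix R = 10 are illustrative assumptions, not from the lecture):

```python
# Radix-Sort sketch over base-R digits of non-negative integers,
# least-significant digit first, using stable Counting-Sort per pass.
def radix_sort(a, d, R=10):
    # digit i of x in base R (i = 0 is the least significant)
    def digit(x, i):
        return (x // R**i) % R

    for i in range(d):                  # d passes, each Θ(n + R)
        count = [0] * R
        for x in a:                     # tally digit i of every element
            count[digit(x, i)] += 1
        start = [0] * R                 # prefix sums -> start positions
        for k in range(1, R):
            start[k] = start[k - 1] + count[k - 1]
        out = [0] * len(a)
        for x in a:                     # stable left-to-right placement
            out[start[digit(x, i)]] = x
            start[digit(x, i)] += 1
        a = out
    return a
```

Each pass costs Θ(n + R) and there are d passes, matching the Θ(d(n + R)) bound above.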
Problems