

SORTING ALGORITHMS

19CSE302 -Design and Analysis of Algorithms

Dharun M -CB.EN.U4CSE22110
Vijay H Anand -CB.EN.U4CSE22156

PROBLEM STATEMENT:
Commence the initial stage by coding the core sorting algorithms, namely:
• In-Place Quick Sort (pick the pivot as the last element)
• 3-Way Merge Sort
• In-Place Heap Sort
• Bucket Sort
• Radix Sort (for this, the input is a linked list with a maximum size of 25, in which one node of the linked list holds exactly one element)
Afterward, create a series of random input test cases of diverse sizes, starting with a minimum of 100, and including 500 and 1000. Proceed to assess the efficiency of these algorithms using the criteria listed below:
• Number of comparisons (where applicable)
• Number of swaps (where applicable)
• Number of basic operations (other than the ones mentioned above)
• Execution time in milliseconds
• Memory usage
Upon completing this evaluation, articulate your findings and deduce conclusions.
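Two of the criteria above (execution time in milliseconds and memory usage) can be captured with one small harness. A minimal sketch, assuming Python's standard time and tracemalloc modules; the measure helper and the sort_fn parameter are illustrative names, not part of the assignment code:

import time
import tracemalloc

def measure(sort_fn, data):
    # Run sort_fn on a copy of the input and report wall-clock time in
    # milliseconds plus peak memory in bytes. sort_fn is a hypothetical
    # stand-in for any of the five sorting routines under test.
    arr = list(data)                 # copy, so every algorithm sees the same input
    tracemalloc.start()
    start = time.perf_counter()
    sort_fn(arr)
    elapsed_ms = (time.perf_counter() - start) * 1000
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed_ms, peak_bytes

# Example run with Python's built-in sort as a placeholder:
ms, peak = measure(list.sort, [5, 2, 9, 1, 7])
print(f"{ms:.3f} ms, peak {peak} bytes")

time.perf_counter is used instead of time.time because it is monotonic and has higher resolution, which matters for interval timing on small inputs.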

Q1) IN-PLACE QUICK_SORT ANALYSIS


In-place quicksort is a version of quicksort that sorts the numbers within
the array itself, whereas out-of-place variants copy elements into auxiliary
arrays during partitioning.

1) Number of comparisons:
• The partition function is where most of the comparisons happen in the Quick
Sort algorithm.
• The if arr[j] <= pivot: condition checks whether each element should be moved
to the left side of the pivot.
• Each evaluation of this condition counts as a comparison, which is tracked by
incrementing the self.comparisons counter.

This process of partitioning and comparison is repeated recursively for
sub-arrays, gradually leading to the full sorting of the array.
2) No of swaps:
Swaps happen in the partition function, which is responsible for rearranging
elements around the pivot. There are two swap sites: the in-loop swap
(arr[i], arr[j] = arr[j], arr[i]) and the final swap that places the pivot
(arr[i+1], arr[high] = arr[high], arr[i+1]).

Reference: https://stackoverflow.com/questions/70066586/maximum-and-minimum-
amont-of-swaps-for-quicksort-algoritm.

3) No of basic operations:
Index updates and assignments:

• Incrementing i (i += 1) is a basic operation.
• Assigning the pivot (pivot = arr[high]) is also counted as a basic operation.
• Incrementing the loop variable j (in for j in range(low, high):) and the loop
bookkeeping itself could also be considered basic operations, although these are
generally not counted separately in algorithm analysis.

An instrumented sketch combining all three counters follows.
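A minimal sketch of the partition and quick-sort routines with all three counters attached. The Counters class and the standalone-function form are illustrative assumptions; the report's code keeps the counters on self:

class Counters:
    # Stand-in for the report's self.comparisons / self.swaps counters.
    def __init__(self):
        self.comparisons = 0
        self.swaps = 0
        self.basic_ops = 0

def partition(arr, low, high, c):
    pivot = arr[high]                # last element as pivot (an assignment)
    c.basic_ops += 1
    i = low - 1
    for j in range(low, high):
        c.comparisons += 1           # the arr[j] <= pivot check
        if arr[j] <= pivot:
            i += 1                   # index update (a basic operation)
            c.basic_ops += 1
            arr[i], arr[j] = arr[j], arr[i]        # first swap condition
            c.swaps += 1
    arr[i + 1], arr[high] = arr[high], arr[i + 1]  # final swap of the pivot
    c.swaps += 1
    return i + 1

def quick_sort(arr, low, high, c):
    if low < high:
        p = partition(arr, low, high, c)
        quick_sort(arr, low, p - 1, c)      # recurse on the left sub-array
        quick_sort(arr, p + 1, high, c)     # recurse on the right sub-array

c = Counters()
data = [10, 7, 8, 9, 1, 5]
quick_sort(data, 0, len(data) - 1, c)
print(data, c.comparisons, c.swaps, c.basic_ops)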

4) Execution Time:
The simplest way to find execution time in Python (converted to milliseconds,
as the problem statement asks for):

import time
start_time = time.time()
main()  # run the sorting routine under test
print("--- %.3f ms ---" % ((time.time() - start_time) * 1000))

TIME COMPLEXITY:
Best Case (O(n log n)): The best-case scenario occurs when the pivot element
divides the array into two nearly equal halves each time.

Average Case (O(n log n)): On average, the pivot will divide the array into
two sub-arrays of roughly equal size.

Worst Case (O(n^2)): The worst-case scenario occurs when the pivot element is
the smallest or largest element each time; with the last element as pivot, an
already-sorted array triggers this case.

SPACE COMPLEXITY:
AVERAGE CASE: O(log n): Quick Sort is an in-place sorting algorithm, so it needs
no auxiliary array proportional to the input size, but the recursion stack is
about log n levels deep on average.

WORST CASE: O(n), when every partition is maximally unbalanced and the recursion
depth grows to n levels.

Q2) 3-WAY MERGE_SORT ANALYSIS


A 3-way merge sort is a variant of the traditional merge sort algorithm that
divides the array into three subarrays, rather than the usual two, and then recursively
sorts each of these subarrays. The sorted subarrays are then merged together into a
single sorted array.

1) Number of comparisons:

In the merge function:

• Initial comparison block: the main while loop compares elements from all three
subarrays (left, middle, right). Each comparison made here is counted by
self.comparisons += 1.
• Subsequent comparison blocks: there are additional while loops to merge any
remaining elements from the left, middle, and right subarrays into the main
array. Each comparison within these loops is also counted.

Counting mechanism:

• Comparisons are counted every time an if statement is evaluated to compare
elements from the subarrays, as in the sketch below.
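A minimal sketch of such a 3-way merge with the comparison counter. The merge3 name and the dict-based counter are illustrative stand-ins for the report's method and its self.comparisons attribute:

def merge3(a, b, c_run, counters):
    # Merge three sorted runs into one list, counting element comparisons.
    merged = []
    i = j = k = 0
    # Initial comparison block: pick the smallest head among the three runs.
    while i < len(a) and j < len(b) and k < len(c_run):
        counters["comparisons"] += 1
        if a[i] <= b[j]:
            counters["comparisons"] += 1
            if a[i] <= c_run[k]:
                merged.append(a[i]); i += 1
            else:
                merged.append(c_run[k]); k += 1
        else:
            counters["comparisons"] += 1
            if b[j] <= c_run[k]:
                merged.append(b[j]); j += 1
            else:
                merged.append(c_run[k]); k += 1
    # Subsequent comparison blocks: merge whichever two runs still have elements.
    while i < len(a) and j < len(b):
        counters["comparisons"] += 1
        if a[i] <= b[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(b[j]); j += 1
    while i < len(a) and k < len(c_run):
        counters["comparisons"] += 1
        if a[i] <= c_run[k]:
            merged.append(a[i]); i += 1
        else:
            merged.append(c_run[k]); k += 1
    while j < len(b) and k < len(c_run):
        counters["comparisons"] += 1
        if b[j] <= c_run[k]:
            merged.append(b[j]); j += 1
        else:
            merged.append(c_run[k]); k += 1
    # Copy any remaining tail; no comparisons are needed here.
    merged.extend(a[i:]); merged.extend(b[j:]); merged.extend(c_run[k:])
    return merged

counters = {"comparisons": 0}
print(merge3([1, 4], [2, 5], [0, 3], counters), counters)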

2) No of Swaps:
• Number of swaps: 0 in a merge sort (including this 3-way merge sort).
• Number of moves: the number of moves (or copies) equals the number of
elements being sorted, because each element is moved exactly once during
each merge pass.

3) No of basic operations:
• Array accesses: each time an element from one of the subarrays is copied
into the merged array.
• Merge operations: the process of placing the minimum element (from the
left, middle, or right subarray) into the final merged array.
• Splitting operations: dividing the array into three subarrays recursively.
• Final insertion: placing remaining elements from any subarray that hasn't
been fully traversed into the merged array.

4) EXECUTION TIME:
The simplest way to find execution time in Python (converted to milliseconds):

import time
start_time = time.time()
main()  # run the sorting routine under test
print("--- %.3f ms ---" % ((time.time() - start_time) * 1000))

TIME COMPLEXITY:

Given that there are log_3 n levels of recursion (where n is the number of elements
in the array), and each level involves a merging process that takes O(n) time, the
overall cost satisfies the recurrence T(n) = 3T(n/3) + O(n), which gives:

T(n) = O(n log_3 n)

Because log_3 n = log_2 n / log_2 3, i.e., any two logarithm bases differ only by a
constant factor, the base of the logarithm is not significant in Big-O notation.
Thus, it simplifies to:

T(n) = O(n log n)

Space Complexity:
The maximum depth of recursion is log_3 n, which translates to O(log n) in Big-O
notation. In addition, the merge step allocates temporary subarrays holding all n
elements, so the overall auxiliary space is O(n).
Q3) IN_PLACE HEAP SORT:
In-Place Heap Sort is a variant of the Heap Sort algorithm that sorts
elements directly within the original array, rather than using additional
space for a heap structure.

1) Number of comparisons:
In total, for each call to heapify, there are two primary comparisons: one for
the left child and one for the right child. These comparisons determine the largest
element among the parent and its children to maintain the max-heap structure.

For each element in the array, heapify is called to ensure the remaining elements
satisfy the max-heap property.

The number of comparisons in each call to heapify will vary depending on the
structure of the heap, but it always involves checking the left and right children as
described above.
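A minimal sketch of heapify and the sorting loop with both comparison and swap counters; the function names and the dict-based counter are illustrative assumptions, not the report's exact code:

def heapify(arr, n, i, c):
    # Sift arr[i] down within a max-heap of size n, counting the work done.
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n:
        c["comparisons"] += 1          # left child vs. current largest
        if arr[left] > arr[largest]:
            largest = left
    if right < n:
        c["comparisons"] += 1          # right child vs. largest so far
        if arr[right] > arr[largest]:
            largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        c["swaps"] += 1
        heapify(arr, n, largest, c)    # may trigger further swaps below

def heap_sort(arr, c):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):    # build the max-heap, O(n)
        heapify(arr, n, i, c)
    for end in range(n - 1, 0, -1):        # move the current maximum to the end
        arr[0], arr[end] = arr[end], arr[0]
        c["swaps"] += 1
        heapify(arr, end, 0, c)            # restore the heap on the shrunk prefix

c = {"comparisons": 0, "swaps": 0}
data = [4, 10, 3, 5, 1]
heap_sort(data, c)
print(data, c)     # [1, 3, 4, 5, 10] plus the operation counts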

2) No of Swaps:
A swap occurs when the current node (initially assumed to be the largest) is found to
be smaller than one of its children. Specifically:

• If the left child is greater than the current node, a swap might occur.
• If the right child is greater than the largest so far (which could be the left
child or the current node), another swap might occur.

After swapping, the heapify function is recursively called on the affected subtree,
which might result in additional swaps if the max-heap property is not yet restored.
The sorting process begins by repeatedly extracting the root of the heap (the
maximum element) and swapping it with the last element in the heap. This places the
maximum element at the end of the array, effectively reducing the heap size by one.
Swaps directly reflect the amount of work done to reorder elements in the array,
which in turn affects the algorithm's overall performance.

3) No of basic operations:
• Recursive operations: the recursive nature of heapify means that a single
call can trigger additional operations further down the heap.

By analyzing these components, you can estimate the total number of basic
operations for any given input size. The actual counts vary depending on the
input, but the worst-case scenario generally follows the O(n log n) complexity for
comparisons and swaps.
4) EXECUTION TIME:
Measured with the same timing approach shown for Quick Sort and 3-Way Merge Sort
(wall-clock time around the sort call, converted to milliseconds).

TIME COMPLEXITY:
Overall Time Complexity:

 The time complexity for building the heap is O(n), and the time complexity for
the sorting phase is O(n log n).
 Therefore, the overall time complexity of the heap sort algorithm is O(n log n).

SPACE COMPLEXITY:
Overall Space Complexity:

• The overall space complexity of heap sort is O(1) auxiliary space for the
in-place sorting itself; a recursive heapify (as described above) additionally
uses O(log n) stack space, which an iterative heapify avoids.
Q4) BUCKET_SORT
Bucket sort is a sorting algorithm that separates the elements into multiple
groups called buckets. Elements are first distributed (ideally uniformly) into
the buckets, each bucket is then sorted by another sorting algorithm (here,
insertion sort), and finally the buckets are concatenated in order to produce
the sorted output.

1) NO_OF_COMPARISONS:
The total number of comparisons in Bucket Sort is mainly influenced by:

• The number of elements in each bucket.
• The efficiency of the sorting algorithm used within each bucket.
• The distribution of elements across buckets.

2) NO_OF_SWAPS:
• Swaps during Bucket Sort: occur within the insertion sort phase for each
bucket.
• Impact on performance: the number of swaps is affected by the distribution
of elements in the buckets and the efficiency of the sorting algorithm used
within each bucket.

Swaps are a crucial aspect of the bucket sort’s efficiency, especially when dealing
with the sorting algorithm within the buckets.

3) NO_OF_BASIC_OPERATIONS:
• Bucket distribution: primarily involves assignments and index calculations.
• Sorting buckets: depends on the internal sorting algorithm. For insertion
sort, this includes comparisons, swaps, and assignments.
• Concatenation: involves assignments to merge the sorted elements from
buckets into the final array.

A counting sketch covering these three phases follows.
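A minimal sketch with insertion sort inside each bucket; the [0, 1) value range, the bucket count of 10, and the dict-based counters are illustrative assumptions:

def insertion_sort(bucket, c):
    # Sort one bucket in place, counting comparisons and element shifts.
    for i in range(1, len(bucket)):
        key = bucket[i]
        j = i - 1
        while j >= 0:
            c["comparisons"] += 1
            if bucket[j] > key:
                bucket[j + 1] = bucket[j]   # shift counts as the "swap" here
                c["swaps"] += 1
                j -= 1
            else:
                break
        bucket[j + 1] = key

def bucket_sort(arr, c, num_buckets=10):
    # Distribute, sort each bucket, then concatenate.
    buckets = [[] for _ in range(num_buckets)]
    for x in arr:
        buckets[int(x * num_buckets)].append(x)   # index calculation + assignment
        c["basic_ops"] += 1
    result = []
    for b in buckets:
        insertion_sort(b, c)                      # sort each bucket internally
        result.extend(b)                          # concatenation phase
        c["basic_ops"] += len(b)
    return result

c = {"comparisons": 0, "swaps": 0, "basic_ops": 0}
print(bucket_sort([0.42, 0.32, 0.23, 0.52, 0.25], c), c)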
4) EXECUTION_TIME:

Time Complexity:

• Best Case: O(n)
• Average Case: O(n + k log k)
• Worst Case: O(n^2)

Space Complexity: O(n)

Q5) RADIX_SORT
Radix Sort works by processing each digit of the numbers from the least
significant digit (LSD) to the most significant digit (MSD) or vice versa. The
sorting is performed in multiple passes, where each pass sorts the numbers
based on a single digit.

1) NO_OF_COMPARISONS:

Radix Sort does not rely on comparisons to sort elements. Instead, it uses a
digit-based approach with counting and bucketing operations, which makes it a non-
comparative sorting algorithm. Therefore, the "number of comparisons" in Radix Sort
is zero, as it completely avoids direct comparisons between elements.

2) NO_OF_SWAPS:

In summary, Radix Sort does not involve any swaps as part of its sorting process.
Instead, it sorts by placing elements into buckets or directly into their correct
positions based on digit values, with no need for swapping. Therefore, the "number of
swaps" in Radix Sort is zero.

3) NO_OF_BASIC_OPERATIONS:

The number of basic operations in Radix Sort is directly proportional to the
product of the number of elements (n) and the number of digits (d). These operations
include extracting digits, distributing elements into buckets, and gathering them back
into the sorted array. The total number of basic operations is O(n * d), making Radix
Sort efficient for sorting large datasets with a fixed range of digit values.
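A minimal LSD radix sort sketch with a basic-operation counter. A Python list stands in for the assignment's linked list of up to 25 nodes, and the dict counter is an illustrative assumption:

def radix_sort(values, c):
    # LSD radix sort for non-negative integers: one bucketing pass per digit.
    if not values:
        return values
    max_val = max(values)
    exp = 1
    while max_val // exp > 0:                     # d passes in total
        buckets = [[] for _ in range(10)]
        for v in values:                          # digit extraction + placement
            buckets[(v // exp) % 10].append(v)
            c["basic_ops"] += 1
        values = [v for b in buckets for v in b]  # stable gather, no comparisons
        c["basic_ops"] += len(values)
        exp *= 10
    return values

c = {"basic_ops": 0}
print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66], c), c)

Because each pass places elements into digit buckets and gathers them back in order, no element is ever compared with or swapped against another, matching the zero counts noted above.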

4) EXECUTION_TIME:

Time Complexity:
• O(d × n), where:
  o n is the number of elements.
  o d is the maximum number of digits in the largest element.

Space Complexity:
• O(n), where n is the number of elements in the linked list.
