PDSA Search Algo
Instructions:
• Read this study material carefully and make your own handwritten short notes. (Short
notes must not be more than 5-6 pages)
• Revise this material at least 5 times and once you have prepared your short notes, then
revise your short notes twice a week
• If you are not able to understand any topic or require a detailed explanation,
please mention it in our discussion forum on the website.
• Let me know if there are any typos or mistakes in the study material. Mail
me at piyushwairale100@gmail.com
1.1 Linear Search
• Linear search is a method of finding elements within a list. It is also called a sequential
search. It is the simplest searching algorithm because it searches for the desired element
in a sequential manner.
• It compares each element with the value we are searching for. If they match, the
element is found, and the algorithm returns the key's index position.
1.1.1 Algorithm:
1. Start from the beginning of the list.
2. Compare each element with the target value.
3. If an element matches the target, return its index.
4. If the end of the list is reached without finding the target, return a special value (e.g.,
-1) to indicate that the target is not in the list.
1.1.2 Implementation:
def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1
1.1.3 Example:
my_list = [2, 4, 7, 1, 9, 6]
target_value = 7
result = linear_search(my_list, target_value)
print(f"Index of {target_value}: {result}")
1.2 Binary Search
• There are many searching algorithms, but binary search is the most popular among
them.
• The elements in the list must be sorted to apply the binary search algorithm. If
elements are not sorted then sort them first.
• Binary search is an efficient search algorithm that finds the position of a target value
within a sorted array.
• In the binary search algorithm, we can find the element position using the following
methods:
1. Recursive Method
2. Iterative Method
• The recursive method follows the divide and conquer technique. In this method, a
function calls itself repeatedly until it finds the element in the list.
• Binary search is more efficient than linear search because we do not need to examine
every list index. The list must be sorted to apply the binary search algorithm.
1.2.1 Algorithm:
1. Start with the entire sorted array.
2. Set the lower bound to the first index and the upper bound to the last index.
3. Compute the middle index of the current range.
4. If the middle element equals the target, return its index.
5. If the target is less than the middle element, search in the left half. Otherwise, search
in the right half.
6. Repeat the process until the target is found or the bounds overlap.
1.2.2 Implementation:
def binary_search(arr, target):
    lower, upper = 0, len(arr) - 1
    while lower <= upper:
        mid = (lower + upper) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lower = mid + 1
        else:
            upper = mid - 1
    return -1
1.2.3 Example:
sorted_list = [1, 2, 4, 6, 7, 9]
target_value = 7
result = binary_search(sorted_list, target_value)
print(f"Index of {target_value}: {result}")
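Only the iterative method is shown above. A minimal sketch of the recursive method mentioned earlier (the function and parameter names are illustrative, not from the original material):

```python
def binary_search_recursive(arr, target, lower, upper):
    # base case: empty range means the target is absent
    if lower > upper:
        return -1
    mid = (lower + upper) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        # target can only lie in the right half
        return binary_search_recursive(arr, target, mid + 1, upper)
    else:
        # target can only lie in the left half
        return binary_search_recursive(arr, target, lower, mid - 1)

sorted_list = [1, 2, 4, 6, 7, 9]
print(binary_search_recursive(sorted_list, 7, 0, len(sorted_list) - 1))  # → 4
```

Each recursive call halves the search range, which is exactly the divide and conquer behaviour described above.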
2 Sorting Algorithms
• Insertion sort takes O(n²) time in the worst case. It is a fast in-place sorting algorithm
for small input sizes.
• Merge sort has a better asymptotic running time of O(n log n), but it does not operate
in place.
• Heap sort sorts n numbers in place in O(n log n) time. It uses a data structure called a
heap, with which we can also implement a priority queue.
• Quick sort also sorts n numbers in place, but its worst-case running time is O(n²).
Its average case is O(n log n). The constant factor in quick sort's running time is small,
so this algorithm performs well for large input arrays.
• Insertion sort, merge sort, heap sort, and quick sort are all comparison-based sorts;
they determine the sorted order of an input array by comparing elements.
2.1 Selection Sort
• Selection Sort is another simple sorting algorithm. It divides the input list into a sorted
and an unsorted region. The algorithm repeatedly selects the smallest (or largest,
depending on the order) element from the unsorted region and swaps it with the first
element of the unsorted region. The process is repeated until the entire list is sorted.
• Basic idea: Find the minimum element in the unsorted portion of the array and swap
it with the first unsorted element. Repeat until the array is fully sorted.
• Advantages: Simple implementation, works well for small datasets, requires only
constant space, in-place sorting algorithm
2.1.1 Algorithm:
1. Find the minimum element in the unsorted part of the array.
2. Swap it with the first element of the unsorted part.
3. Move the boundary between the sorted and unsorted parts one element to the right,
and repeat until the array is fully sorted.
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        arr[i], arr[min_index] = arr[min_index], arr[i]
• Time complexity: O(n²) in the best, average, and worst cases; the number of com-
parisons does not depend on the initial order of the input.
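The search sections above include a worked example; a similar usage check for selection sort (the list values are illustrative, and the function is repeated so the snippet runs on its own):

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        min_index = i
        # find the smallest element in the unsorted part
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        # swap it to the front of the unsorted part
        arr[i], arr[min_index] = arr[min_index], arr[i]

my_list = [2, 4, 7, 1, 9, 6]
selection_sort(my_list)
print(my_list)  # → [1, 2, 4, 6, 7, 9]
```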
2.2 Bubble Sort
• Basic idea: Iterate through the array repeatedly, comparing adjacent pairs of elements
and swapping them if they are in the wrong order. Repeat until the array is fully sorted.
• Advantages: Simple implementation, works well for small datasets, requires only
constant space, stable sorting algorithm
2.2.1 Algorithm:
1. Compare each pair of adjacent elements in the array.
2. Swap them if they are in the wrong order.
3. Repeat the passes until a complete pass makes no swaps, i.e., the array is sorted.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
• Time complexity: O(n²) in the worst and average cases; O(n) in the best case (an
already sorted input), provided the algorithm stops early when a pass makes no swaps.
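The O(n) best case requires detecting a pass with no swaps and stopping early. A sketch of that optimized variant (the `swapped` flag and function name are illustrative):

```python
def bubble_sort_optimized(arr):
    n = len(arr)
    for i in range(n):
        swapped = False  # track whether this pass changed anything
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            # no swaps: the array is already sorted, stop early
            break

data = [1, 2, 3, 4, 5]
bubble_sort_optimized(data)  # sorted input: exits after a single O(n) pass
print(data)  # → [1, 2, 3, 4, 5]
```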
2.3 Insertion Sort
• Basic idea: Build up a sorted subarray from left to right by inserting each new element
into its correct position in the subarray. Repeat until the array is fully sorted.
• Advantages: Simple implementation, works well for small datasets, requires only
constant space, efficient for partially sorted datasets, stable sorting algorithm
2.3.1 Algorithm:
1. Build a sorted array one element at a time.
2. Take each element and insert it into its correct position in the sorted array.
def insertion_sort(arr):
    n = len(arr)
    for i in range(1, n):
        key = arr[i]
        j = i - 1
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
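A self-contained usage check for insertion sort (the list values are illustrative). On a nearly sorted input the inner while loop barely runs, which is why the algorithm is efficient for partially sorted data:

```python
def insertion_sort(arr):
    n = len(arr)
    for i in range(1, n):
        key = arr[i]
        j = i - 1
        # shift elements larger than key one slot to the right
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert key into its correct position

nearly_sorted = [1, 2, 4, 3, 6, 9]
insertion_sort(nearly_sorted)
print(nearly_sorted)  # → [1, 2, 3, 4, 6, 9]
```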
3 Divide and Conquer
• Divide and conquer is a powerful algorithmic paradigm that involves:
– Breaking the problem into several sub-problems that are similar to the original
problem but smaller in size.
– Solving the sub-problems recursively (successively and independently).
– Finally, combining these solutions to the sub-problems to create a solution to the
original problem.
This technique can be divided into the following three parts:
Divide: This involves dividing the problem into smaller sub-problems.
Conquer: Solve the sub-problems recursively until solved.
Combine: Combine the solutions of the sub-problems to get the final solution of the whole problem.
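As a minimal illustration of the three parts, consider finding the maximum of a list by divide and conquer (the function name is illustrative, not from the original material):

```python
def dc_max(arr):
    # divide: a single element is its own maximum (base case)
    if len(arr) == 1:
        return arr[0]
    mid = len(arr) // 2
    # conquer: solve each half recursively
    left_max = dc_max(arr[:mid])
    right_max = dc_max(arr[mid:])
    # combine: the answer is the larger of the two sub-results
    return left_max if left_max > right_max else right_max

print(dc_max([2, 4, 7, 1, 9, 6]))  # → 9
```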
Divide-and-Conquer Examples
• Binary Search
• Merge Sort
• Quick Sort
3.1 Mergesort
Merge Sort is a divide-and-conquer sorting algorithm. It divides the array into two halves,
recursively sorts them, and finally merges the two sorted halves.
3.1.1 Algorithm:
1. Divide: Divide the unsorted list into n sublists, each containing one element.
2. Conquer: Repeatedly merge sublists to produce new sorted sublists until there is only
one sublist remaining.
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]
        merge_sort(left_half)
        merge_sort(right_half)
        i = j = k = 0
        # merge the two sorted halves back into arr
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1
        # copy any elements remaining in either half
        arr[k:] = left_half[i:] + right_half[j:]
3.2 Quicksort
Quicksort is a sorting algorithm. The algorithm picks a pivot element and rearranges the
array elements so that all elements smaller than the picked pivot element move to the left
side of the pivot, and all greater elements move to the right side. Finally, the algorithm
recursively sorts the subarrays on the left and right of the pivot element.
3.2.1 Algorithm:
1. Divide: Choose a pivot element and partition the array into two subarrays such that
elements smaller than the pivot are on the left, and elements greater than the pivot
are on the right.
2. Conquer: Recursively apply quicksort to the subarrays.
3. Combine: The final result is a sorted array.
def quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quicksort(left) + middle + quicksort(right)
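The version above is easy to read but builds new lists on every call, using O(n) extra space; the O(log n) additional space usually quoted for quicksort assumes an in-place partition. A sketch using the Lomuto partition scheme (the helper and function names are illustrative):

```python
def partition(arr, low, high):
    # Lomuto partition: the last element is the pivot
    pivot = arr[high]
    i = low - 1  # boundary of the "smaller than pivot" region
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # place the pivot just after the smaller-than-pivot region
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1  # final pivot position

def quicksort_inplace(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quicksort_inplace(arr, low, p - 1)
        quicksort_inplace(arr, p + 1, high)

data = [7, 2, 9, 4, 1, 6]
quicksort_inplace(data)
print(data)  # → [1, 2, 4, 6, 7, 9]
```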
3.3 Comparison:
• Mergesort:
– Stable.
– Time Complexity: O(n log n) in all cases.
– Space Complexity: O(n) additional space.
• Quicksort:
– Unstable.
– Time Complexity: O(n log n) average case, O(n²) worst case (with poor pivot
selection).
– Space Complexity: O(log n) additional space.