Algorithms

The document provides a comprehensive list of algorithms categorized into searching, sorting, graph, array, tree, and other algorithms, along with their time complexities. Each algorithm is briefly described, highlighting its primary function and performance characteristics. This serves as a quick reference for understanding various algorithms and their efficiency in computational tasks.


Here's the list of algorithms, rearranged by category, with a note on the time complexity of each:

Searching Algorithms:

1. Depth First Search (DFS): Explores as far as possible along each branch before
backtracking.
Complexity: O(V + E) where V is vertices and E is edges.
2. Breadth First Search (BFS): Explores all neighbors at the present depth level before
moving on to the next level.
Complexity: O(V + E).
3. Binary Search: Searches a sorted array by repeatedly halving the search interval (see the sketch after this list).
Complexity: O(log n).
4. Linear Search: Sequentially checks each element of the array.
Complexity: O(n).
5. Rabin-Karp Algorithm: A string-search algorithm that uses hashing to find any one of a set of pattern strings in a text.
Complexity: O(n + m), where n is the length of the text and m is the length of the pattern
(best case), O(n*m) in the worst case.
6. Z Algorithm: Finds occurrences of a pattern in a string in linear time by computing the Z-array.
Complexity: O(n + m), where n is the text length and m is the pattern length.
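
As a concrete illustration of Binary Search (item 3), here is a minimal Python sketch of the iterative version; the function name binary_search is illustrative, not taken from any particular library.

def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1  # target can only be in the right half
        else:
            hi = mid - 1  # target can only be in the left half
    return -1

# Example: binary_search([1, 3, 5, 7, 9], 7) returns 3.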

Sorting Algorithms:

1. Quick Sort: A divide-and-conquer algorithm; selects a pivot element and partitions the
array around the pivot.
Complexity: O(n log n) average, O(n²) worst case.
2. Merge Sort: Divides the array into halves, sorts each half, and then merges them (see the sketch after this list).
Complexity: O(n log n).
3. Heap Sort: Builds a max heap and then repeatedly extracts the maximum element to
sort the array.
Complexity: O(n log n).
4. Insertion Sort: Builds the sorted array one element at a time by comparing each new
element with the existing sorted elements.
Complexity: O(n²).
5. Selection Sort: Repeatedly selects the smallest element from the unsorted part of the
list and swaps it with the first unsorted element.
Complexity: O(n²).
6. Counting Sort: Non-comparative sorting that counts the frequency of each element and
uses it to sort the array.
Complexity: O(n + k), where k is the range of the input.
7. Radix Sort: Sorts numbers digit by digit, using a stable sub-sorting algorithm like
counting sort.
Complexity: O(d(n + k)), where d is the number of digits.
8. Shell Sort: Generalization of insertion sort that allows the exchange of far-apart
elements.
Complexity: O(n log² n) to O(n^(3/2)), depending on the gap sequence.
9. Bubble Sort: Repeatedly swaps adjacent elements if they are in the wrong order.
Complexity: O(n²).
10. Bucket Sort: Distributes elements into several buckets and then sorts each bucket
individually.
Complexity: O(n + k), where k is the number of buckets.
11. Comb Sort: Improves bubble sort by comparing elements farther apart.
Complexity: O(n log n) best, O(n²) worst case.
12. Cycle Sort: Used for minimizing the number of writes when sorting.
Complexity: O(n²).
13. Pigeonhole Sort: Efficient for sorting lists where the number of elements and range are
approximately the same.
Complexity: O(n + range).
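
To make Merge Sort (item 2) concrete, here is a minimal Python sketch that returns a new sorted list; the helper name merge_sort is illustrative.

def merge_sort(arr):
    """Sort arr in O(n log n) by splitting, recursing, and merging."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Example: merge_sort([5, 2, 9, 1]) returns [1, 2, 5, 9].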

Graph Algorithms:

1. Dijkstra's Algorithm: Finds the shortest paths from a source node in a graph with non-negative edge weights (see the sketch after this list).
Complexity: O((V + E) log V) using a priority queue.
2. Bellman-Ford Algorithm: Calculates shortest paths from a single source node, capable
of handling negative weights.
Complexity: O(VE).
3. Floyd Warshall Algorithm: Finds shortest paths between all pairs of nodes in a graph.
Complexity: O(V³).
4. Kruskal's Algorithm: Finds the minimum spanning tree by sorting edges and adding
them to the spanning tree.
Complexity: O(E log V).
5. Prim's Algorithm: Finds a minimum spanning tree by growing the tree outward from a starting node.
Complexity: O(E log V) using a priority queue.
6. Boruvka's Algorithm: Finds the minimum spanning tree in a graph, suitable for parallel
execution.
Complexity: O(E log V).
7. Kosaraju's Algorithm: Used to find strongly connected components in a directed graph.
Complexity: O(V + E).
8. Tarjan's Algorithm: Finds strongly connected components in a directed graph and is
based on DFS.
Complexity: O(V + E).
9. Topological Sort: Orders vertices in a directed acyclic graph (DAG).
Complexity: O(V + E).
10. Flood Fill Algorithm: Fills a connected component in a grid or image.
Complexity: O(n), where n is the number of cells.
11. Lee Algorithm: Finds the shortest path in a 2D grid using BFS.
Complexity: O(n), where n is the number of cells.
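
The sketch below illustrates Dijkstra's Algorithm (item 1) using Python's heapq module as the priority queue; the adjacency-list format {node: [(neighbor, weight), ...]} is an assumption made for this example.

import heapq

def dijkstra(graph, source):
    """Shortest distances from source; edge weights must be non-negative."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Example: dijkstra({"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}, "a")
# returns {"a": 0, "b": 1, "c": 3}.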

Array Algorithms:

1. Kadane's Algorithm: Finds the maximum-sum contiguous subarray in a single pass (see the sketch after this list).
Complexity: O(n).
2. Knuth-Morris-Pratt Algorithm (KMP): String searching algorithm that avoids
re-evaluation of characters.
Complexity: O(n + m).
3. Floyd's Cycle Detection Algorithm: Also known as the tortoise and hare algorithm,
detects cycles in linked lists.
Complexity: O(n).
4. Quick Select Algorithm: Finds the kth smallest/largest element in an unsorted array.
Complexity: O(n) average, O(n²) worst case.
5. Boyer-Moore Majority Vote Algorithm: Finds the majority element in a sequence.
Complexity: O(n).
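
A minimal Python sketch of Kadane's Algorithm (item 1); the name max_subarray_sum is illustrative.

def max_subarray_sum(nums):
    """Maximum sum of a contiguous, non-empty subarray in O(n)."""
    best = current = nums[0]
    for x in nums[1:]:
        current = max(x, current + x)  # extend the current run or start a new one at x
        best = max(best, current)
    return best

# Example: max_subarray_sum([-2, 1, -3, 4, -1, 2, 1, -5, 4]) returns 6.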

Tree Algorithms:

1. Fibonacci Heap: Used in network optimization algorithms like Dijkstra’s and Prim’s.
Complexity: amortized O(1) for insert and decrease-key, amortized O(log n) for extract-min.
2. Binary Indexed Tree (Fenwick Tree): Efficient for prefix-sum queries and point updates (see the sketch after this list).
Complexity: O(log n) per operation.
3. Suffix Tree: Used for string matching and substring queries.
Complexity: O(n) to build, O(m) for substring queries.
4. AA Tree: A balanced binary search tree.
Complexity: O(log n).
5. Interval Tree: Used to query all intervals that overlap with a given range.
Complexity: O(log n) for insertion and query.
6. Van Emde Boas Tree: Data structure that supports predecessor/successor queries over a bounded integer universe.
Complexity: O(log log U), where U is the size of the universe.
7. Quadtree: Used in spatial indexing for dividing 2D spaces.
Complexity: O(n log n).
8. Cartesian Tree: Combination of binary search tree and heap properties.
Complexity: O(n) to build, O(log n) for queries.
9. Finger Tree: Amortized O(1) operations for elements near the ends.
Complexity: O(log n) for general operations.
10. Splay Tree: Self-adjusting binary search tree.
Complexity: O(log n) amortized.
11. Scapegoat Tree: Self-balancing binary search tree.
Complexity: O(log n).
12. Counted B-Trees: B-trees with additional counters for each node.
Complexity: O(log n).
13. Binary Space Partitioning: Used in computational geometry for partitioning space.
Complexity: O(n log n).
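
A minimal Python sketch of a Binary Indexed Tree (item 2) supporting point updates and prefix-sum queries in O(log n); the class and method names are illustrative.

class FenwickTree:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-indexed internally

    def update(self, i, delta):
        """Add delta to element i (0-indexed)."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & -i  # jump to the next node covering index i

    def prefix_sum(self, i):
        """Sum of elements 0..i inclusive."""
        i += 1
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & -i  # drop the lowest set bit
        return total

# Example: after update(2, 5) and update(0, 3), prefix_sum(2) returns 8.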

Other Algorithms:

1. Huffman Coding: Greedy algorithm for data compression.
Complexity: O(n log n) for constructing the tree.
2. Union Find Algorithm: Disjoint-set data structure for union and find operations (see the sketch after this list).
Complexity: O(α(n)) per operation, where α is the inverse Ackermann function.
3. Euclid's Algorithm: Efficiently finds the greatest common divisor (GCD).
Complexity: O(log(min(a, b))).
4. Manacher's Algorithm: Finds the longest palindromic substring in linear time.
Complexity: O(n).
5. Eulerian Path (Hierholzer's Algorithm): Finds an Eulerian path in a graph.
Complexity: O(V + E).
6. Johnson’s Algorithm: Finds all-pairs shortest paths using Bellman-Ford and Dijkstra.
Complexity: O(V² log V + VE).
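
A minimal Python sketch of the Union-Find structure (item 2) with path compression and union by rank, which gives near-constant amortized time per operation; class and method names are illustrative.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        """Return the representative of x's set, compressing the path as it goes."""
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        """Merge the sets containing a and b; return False if already in one set."""
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True

# Example: uf = UnionFind(4); uf.union(0, 1); uf.find(0) == uf.find(1) is True.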
