Introduction to Data Structures
• Data structures are ways of organizing and storing data in computer memory so that they can be accessed and modified efficiently.
• They are essentially containers that hold data in a specific layout.
• Examples include arrays, linked lists, stacks, queues, trees, and graphs (a few of these are sketched in Python below).
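To make these examples concrete, here is a minimal sketch (variable names like `pixels` and `tasks` are illustrative, not from the slides) showing how an array, a queue, and a graph can be represented with Python's built-in types; a stack is sketched separately further below:

```python
from collections import deque

# Array: index-addressable storage; Python's list plays this role.
pixels = [255, 128, 64, 32]
print(pixels[2])          # indexed access -> 64

# Queue: first-in, first-out; deque supports O(1) operations at both ends.
tasks = deque(["job 1", "job 2"])
tasks.append("job 3")     # enqueue at the back
print(tasks.popleft())    # dequeue from the front -> 'job 1'

# Graph: one common representation is an adjacency mapping
# from each vertex to the list of its neighbours.
graph = {
    "alice": ["bob"],
    "bob":   ["alice", "carol"],
    "carol": ["bob"],
}
print(graph["bob"])       # bob's connections -> ['alice', 'carol']
```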
Data Structures - Applications
• Arrays: Used in applications that require indexed access to elements, such as image processing, where each pixel is accessed by an index (or indices).
• Stacks: Used for undo functionality in software like text editors, for the back button in a web browser, etc. (a small undo sketch appears after this section).
• Queues: Used in task-scheduling systems.
• Trees: Used in databases and file systems to store hierarchical data.
• Graphs: Used in social networks to represent connections between users, and in GPS/navigation systems to find the shortest path.

Efficiency of Data Structures
• Data structures are ways of organizing and storing data to facilitate efficient operations.
• The efficiency of a data structure is evaluated by how quickly and effectively it supports operations such as accessing, inserting, deleting, updating, and searching for data.
• Typically, efficiency is divided into:
  • Time complexity
  • Space complexity

Efficiency of Data Structures
• Time Complexity: Measures how the time to perform operations changes with the size of the data.
• Example
  • Array: Accessing an element by index is O(1) (constant time), but inserting an element at the beginning can be O(n) (linear time) because the existing elements must be shifted (demonstrated in a sketch after this section).

Efficiency of Data Structures
• Space Complexity: Measures the amount of memory used by a data structure as it scales with the size of the data it contains.
• Examples
  • Array: An array with n elements typically has a space complexity of O(n), as the memory usage grows linearly with the number of elements.
  • Linked List: A linked list also has a space complexity of O(n), because each node requires memory to store both the data and a pointer to the next node (see the node sketch after this section).

Algorithm
• An algorithm is a well-defined, step-by-step procedure or set of rules for solving a specific problem or performing a specific task.
• Examples
  • Searching for a value in an array or list
  • Sorting an array
  • Finding the shortest path from a source to a destination, etc.

Analysis of Algorithms
• To go from city "A" to city "B", there are many ways of making the trip: by flight, by bus, by train, or by bicycle.
• Depending on availability and convenience, we choose the one that suits us.
• Similarly, in computer science, multiple algorithms are often available for solving the same problem (for example, the sorting problem has many algorithms: insertion sort, selection sort, quick sort, and many more).
• Algorithm analysis helps us determine which algorithm is most efficient in terms of the time and space consumed.

Analysis of Algorithms
• In most cases, the goal of algorithm analysis is to compare algorithms (or solutions), mainly in terms of running time (in some cases there may be other factors, like space/memory, etc.).

Running Time Analysis
• It is the process of determining how processing time increases as the size of the problem (input size) increases.
• Input size is the number of elements in the input, such as:
  • The size of an array
  • The number of elements in a matrix
  • The vertices and edges in a graph, etc.

How to Compare Algorithms
• Should we use:
  • Execution times? Not a good measure, as execution times are specific to a particular computer.
  • The number of statements executed? Not a good measure, since the number of statements varies with the programming language as well as the style of the individual programmer.
• Ideal solution: We measure how the basic operations (like comparisons, assignments, additions, memory accesses, etc.) grow as the input size n grows. This kind of comparison is independent of machine time, programming style, etc. (a small counting sketch follows).
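Returning to the Applications slide above: here is a hedged sketch of the undo idea, using a Python list as the stack (the action strings and function names are made up for illustration):

```python
# Undo with a stack: every edit is pushed; undo pops the most recent edit.
undo_stack = []

def do_action(action):
    undo_stack.append(action)    # push: O(1) at the end of a Python list

def undo():
    if undo_stack:
        return undo_stack.pop()  # pop: last-in, first-out
    return None                  # nothing left to undo

do_action("type 'h'")
do_action("type 'i'")
print(undo())   # -> "type 'i'"  (the most recent action is undone first)
print(undo())   # -> "type 'h'"
```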
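The array example from the Time Complexity slide can be seen directly: indexed access touches one slot, while inserting at the front shifts every existing element (this sketch assumes Python's list standing in for the array):

```python
arr = [10, 20, 30, 40]

# O(1): indexed access reads one slot, regardless of how large n is.
x = arr[2]          # -> 30

# O(n): inserting at the beginning shifts all n existing elements right.
# Conceptually: [10, 20, 30, 40] -> [_, 10, 20, 30, 40] -> [5, 10, 20, 30, 40]
arr.insert(0, 5)
print(arr)          # -> [5, 10, 20, 30, 40]
```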
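For the Space Complexity slide, here is a minimal singly linked list node in Python; each of the n nodes carries its data plus one reference, so the total space grows as O(n) (the class name is our own, not from the slides):

```python
class Node:
    """One linked-list node: the data plus a pointer to the next node."""
    def __init__(self, data):
        self.data = data    # the stored value
        self.next = None    # reference to the next node (the per-node overhead)

# Build a 3-node list: 1 -> 2 -> 3; space used is proportional to n.
head = Node(1)
head.next = Node(2)
head.next.next = Node(3)

# Traversal visits each node by following the next pointers.
node = head
while node:
    print(node.data)
    node = node.next
```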
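As an illustration of counting basic operations rather than clock time, this sketch counts the comparisons made by a linear search; the count grows with n no matter which machine runs it (the function name is ours):

```python
def linear_search(items, target):
    """Return (index, comparison_count); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1            # one basic operation per element examined
        if value == target:
            return i, comparisons
    return -1, comparisons

# The comparison count scales with the input size n, independent of hardware.
for n in (10, 100, 1000):
    idx, count = linear_search(list(range(n)), -1)  # worst case: not found
    print(n, count)                 # n comparisons for input size n
```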
Rate of growth
• The rate at which the running time increases as a function of the input size is called the rate of growth.
• Some commonly used rates of growth:
  • O(1), Constant: Inserting an element at the end of an array.
  • O(log n), Logarithmic: Finding/searching a value in a sorted array.
  • O(n), Linear: Finding/searching a value in an unsorted array.
  • O(n log n), Linear Logarithmic: Sorting an array using divide-and-conquer, like MergeSort or QuickSort, etc.
  • O(n²), Quadratic: Simple sorting like BubbleSort, or finding the shortest path between two nodes of a graph.
  • O(n³), Cubic: Matrix multiplication.

Asymptotic Analysis
• Asymptotic analysis is used to describe the behavior of an algorithm as the input size approaches infinity.
• It provides a high-level understanding of the algorithm's efficiency by ignoring constant factors and lower-order terms, focusing instead on the most significant term.
• Three notations are commonly used for asymptotic analysis:
  • Big O: Denoting the "at most" or worst-case complexity.
  • Big Omega: Denoting the "at least" or best-case complexity.
  • Big Theta: Denoting the "in between" or average-case complexity.

Asymptotic Analysis
• However, we usually use Big O notation to focus on worst-case scenarios.
• For example, if an algorithm's running time is n⁴ + 100n² + 10n + 50, then we take n⁴ as the Big O complexity and discard the remaining terms, because n⁴ gives the maximum rate of growth for larger values of n (this is worked formally below).
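For reference, here is the standard textbook definition of Big O together with the dominant-term example above, written out formally (the definition and the constants c, n₀ are not stated on the slides; they are the conventional formulation):

```latex
% f(n) is O(g(n)) if, beyond some point n_0, f is bounded above by a
% constant multiple of g:
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \text{ such that }
  0 \le f(n) \le c\,g(n) \quad \text{for all } n \ge n_0.

% For the running time above, every term is at most its own coefficient
% times n^4 once n \ge 1, so the n^4 term dominates:
T(n) = n^4 + 100n^2 + 10n + 50
     \le n^4 + 100n^4 + 10n^4 + 50n^4
     = 161\,n^4 \quad (n \ge 1),
% hence T(n) = O(n^4), with e.g. c = 161 and n_0 = 1.
```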
Common operations on Data Structures
• Some of the common operations performed on data structures are (a combined sketch follows the list):
  • Insertion
  • Access
  • Search
  • Deletion
  • Update
  • Traversal
  • Sorting
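Finally, a compact sketch of those common operations on a Python list (the data values are arbitrary):

```python
data = [7, 3, 9]

data.insert(1, 5)     # Insertion: data is now [7, 5, 3, 9]
first = data[0]       # Access: read by index -> 7
pos = data.index(9)   # Search: find a value's position -> 3
data.remove(3)        # Deletion: remove the value 3 -> [7, 5, 9]
data[0] = 8           # Update: data is now [8, 5, 9]
for value in data:    # Traversal: visit every element in order
    print(value)
data.sort()           # Sorting: data is now [5, 8, 9]
print(data)
```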