Data Structures and Algorithms
Data Structures
Introduction
Overview
The Lecturer
Course structure
Course Aims / Objectives
Topics
Assessment
Course Policies
Lecturer’s Information
Dr. Phillip Kisembe
Mobile: 0552215093
Email: drpkisembe@anuc.edu.gh
The class will meet for three hours every week (see the timetable).
Indicative Content
Data Structures:
Arrays, Lists, Trees, Stacks, Queues, Heaps, Self-balancing trees, Graphs and Hash tables
Algorithms:
Searching and sorting, Recursion, Divide and Conquer strategies, Greedy algorithms,
Prim’s algorithm, Dijkstra’s algorithm, Bellman-Ford algorithm
Complexity and efficiency:
Time and space complexity, Big-O notation, Analysis of algorithms
Software reuse and advanced language features:
Encapsulation, Data hiding, Polymorphism, Exception handling, Events, GUIs.
Course Requirements
Component: Assessment
Learning Outcomes: 2, 3, 4
Essential Reading
McGrath, M. (2011) C++ Programming in Easy Steps. 4th ed. Southam: In Easy Steps.
Recommended Reading
Cormen, Thomas H. (2009) Introduction to Algorithms. 3rd ed. Cambridge, Mass.: MIT Press.
Course Policies
Class Participation:
Preparation and engaged participation at all class
sessions are expected of all students.
Problem → Computer Program (the solution)
Data Structures + Algorithms = Programs
Data Structures
Data may be organized in different ways
Array
Linked List
Queue
Tree
Stack
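As a concrete illustration (a minimal C++ sketch, not from the slides), the same values can be organized as a contiguous array or as a singly linked list:

#include <iostream>

// A node of a singly linked list: each element stores its data plus a
// pointer to the next element.
struct Node {
    int data;
    Node* next;
};

int main() {
    // Array: elements stored contiguously and accessed by index.
    int arr[3] = {10, 20, 30};
    std::cout << "second array element: " << arr[1] << '\n';

    // Linked list: elements reached by following pointers from the head.
    Node c{30, nullptr};
    Node b{20, &c};
    Node a{10, &b};
    for (Node* p = &a; p != nullptr; p = p->next)
        std::cout << p->data << ' ';
    std::cout << '\n';
}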
Data Structures
Data can be organized into:
Primitive types
Composite types
Abstract data types
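For example (an illustrative C++ sketch; the Point and IntStack names are hypothetical, not from the slides):

#include <iostream>

// Composite type: groups several primitive values into one record.
struct Point {
    double x;
    double y;
};

// Abstract data type: the representation is hidden; users work only
// through the operations push, pop, top and empty.
class IntStack {
    int items[100];
    int count = 0;
public:
    void push(int v) { items[count++] = v; }
    void pop()       { --count; }
    int  top() const { return items[count - 1]; }
    bool empty() const { return count == 0; }
};

int main() {
    int n = 42;            // primitive type
    Point p{1.5, 2.5};     // composite type
    IntStack s;            // abstract data type
    s.push(n);
    std::cout << p.x << ' ' << p.y << ' ' << s.top() << '\n';
}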
Stack and Queue / Slide 16
Sorting
Arrange the records in a logical order
Merging
Combining records from two or more files or data structures into one
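A short sketch of both operations using the standard library algorithms std::sort and std::merge (the example itself is not from the slides):

#include <algorithm>
#include <iostream>
#include <iterator>
#include <vector>

int main() {
    std::vector<int> a{5, 1, 4};
    std::vector<int> b{9, 2};

    // Sorting: arrange the records in a logical (here ascending) order.
    std::sort(a.begin(), a.end());   // a = 1 4 5
    std::sort(b.begin(), b.end());   // b = 2 9

    // Merging: combine the records of two sorted structures into one.
    std::vector<int> merged;
    std::merge(a.begin(), a.end(), b.begin(), b.end(),
               std::back_inserter(merged));           // merged = 1 2 4 5 9

    for (int x : merged) std::cout << x << ' ';
    std::cout << '\n';
}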
Program Efficiency & Complexity Analysis of Algorithms
What is an Algorithm?
An algorithm is a definite procedure for solving a problem in a finite number of steps.
Algorithm
An algorithm is a set of instructions to be followed
to solve a problem.
There can be more than one solution (more than one
algorithm) to solve a given problem.
An algorithm can be implemented using different
programming languages on different platforms.
An algorithm must be correct. It should correctly
solve the problem.
e.g. for sorting, this means the output must be correct even if (1) the input is
already sorted, or (2) it contains repeated elements (a sketch follows below).
Once we have a correct algorithm for a problem,
we have to determine the efficiency of that
algorithm.
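The sketch below (not from the slides, and assuming std::sort as the sorting algorithm under test) checks the two correctness cases mentioned above:

#include <algorithm>
#include <cassert>
#include <vector>

int main() {
    // Case 1: the input is already sorted.
    std::vector<int> alreadySorted{1, 2, 3, 4};
    // Case 2: the input contains repeated elements.
    std::vector<int> withDuplicates{3, 1, 3, 2, 1};

    std::sort(alreadySorted.begin(), alreadySorted.end());
    std::sort(withDuplicates.begin(), withDuplicates.end());

    // A correct sorting algorithm must leave both inputs in order
    // (a full test would also check the output is a permutation of the input).
    assert(std::is_sorted(alreadySorted.begin(), alreadySorted.end()));
    assert(std::is_sorted(withDuplicates.begin(), withDuplicates.end()));
}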
Favorite Algorithms
Takes less memory (space efficient)
Takes less execution time
Takes less programming time
Time complexity is the most important of these
Efficient Algorithms
Measuring Efficiency
The efficiency of an algorithm is a measure of
the amount of resources consumed in solving a
problem of size n.
Measuring Efficiency
It would seem that the most obvious way to
measure the efficiency of an algorithm is to
run it and measure how much processor time
is needed
Is this correct?
The measured time would depend on:
Processor speed
Processor type
Programming language
Quality of compiler
Size and nature of input
Operating system
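For instance, empirical timing might look like the sketch below (using std::chrono; this example is not from the slides, and the number it prints will vary with every factor listed above):

#include <chrono>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> data(1000000);
    std::iota(data.begin(), data.end(), 0);   // fill with 0, 1, 2, ...

    auto start = std::chrono::steady_clock::now();
    long long sum = std::accumulate(data.begin(), data.end(), 0LL);
    auto stop = std::chrono::steady_clock::now();

    auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
    std::cout << "sum = " << sum << ", elapsed = " << us << " microseconds\n";
}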
Theoretical Analysis
Uses a high-level description of the algorithm
instead of an implementation
Analyzing an Algorithm
Finding the running time of an algorithm
Analysis of Algorithms
Suppose an algorithm performs 5N + 3 steps on an input of size N:
N = 10 => 53 steps
N = 100 => 503 steps
N = 1,000 => 5,003 steps
N = 1,000,000 => 5,000,003 steps
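A hypothetical loop producing roughly such a count (chosen only to match the 5N + 3 figure used here; not from the slides):

#include <iostream>

// Hypothetical routine whose statement count is roughly 5N + 3:
// a constant amount of setup plus five simple operations per iteration.
long long work(int N) {
    long long sum = 0;        // 1 step
    int i = 0;                // 1 step
    while (i < N) {           // 1 comparison per iteration
        sum = sum + i;        // 1 addition + 1 assignment
        i = i + 1;            // 1 addition + 1 assignment
    }
    return sum;               // 1 step
}

int main() {
    std::cout << work(10) << '\n';   // roughly 53 steps for N = 10
}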
Asymptotic Complexity
The 5N+3 time bound is said to "grow
asymptotically" like N
Time complexity
In computer science, the time complexity of an algorithm
quantifies the amount of time taken by an algorithm to run
as a function of the size of the input to the problem.
Time Complexity
For example, if the time required by an algorithm on all inputs
of size n is at most 5n^3 + 3n, the asymptotic time complexity is
O(n^3).
Time Complexity
Since an algorithm may take a different amount
of time even on inputs of the same size, the
most commonly used measure of time
complexity, the worst-case time complexity of
an algorithm, denoted as T(n), is the maximum
amount of time taken on any input of size n.
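Written as a formula (a standard formulation given here for reference, not taken from the slides), where t(I) is the running time on the particular input I:

$$ T(n) = \max \{\, t(I) : I \text{ is an input of size } n \,\} $$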
Big O Notation
Big O notation describes the limiting behavior of a function
when the argument tends towards a particular value or infinity,
usually in terms of simpler functions.
Linear time: O(N)
Informally, this means that for large enough input sizes the
running time increases linearly with the size of the input.
Logarithmic time: O(log N)
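Two classic illustrations in C++ (sketches, not from the slides): linear search takes O(N) time, while binary search on a sorted array takes O(log N) time because each step halves the remaining range.

#include <vector>

// Linear time, O(N): may have to examine every element.
int linearSearch(const std::vector<int>& a, int key) {
    for (int i = 0; i < (int)a.size(); ++i)
        if (a[i] == key) return i;
    return -1;
}

// Logarithmic time, O(log N): requires the data to be sorted.
int binarySearch(const std::vector<int>& a, int key) {
    int lo = 0, hi = (int)a.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}

int main() {
    std::vector<int> a{2, 3, 5, 7, 11};
    return (linearSearch(a, 7) == binarySearch(a, 7)) ? 0 : 1;
}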
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors
Example: 2n + 10 is O(n)
2n + 10 ≤ cn
(c − 2) n ≥ 10
n ≥ 10/(c − 2)
Pick c = 3 and n0 = 10
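The derivation above is an instance of the usual definition of Big-Oh, stated here for reference (the same definition is used in the examples that follow):

$$ f(n) \text{ is } O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0 $$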
Big-Oh Example
The function n^2 is not O(n)
n^2 ≤ cn
n ≤ c
The above inequality cannot be satisfied since c must be a constant
3n^3 + 20n^2 + 5
3n^3 + 20n^2 + 5 is O(n^3)
need c > 0 and n0 ≥ 1 such that 3n^3 + 20n^2 + 5 ≤ c*n^3 for n ≥ n0
this is true for c = 4 and n0 = 21
Growth-Rate Functions
O(1)         Time requirement is constant, and it is independent of the problem's size.
O(log2n)     Time requirement for a logarithmic algorithm increases slowly as the problem size increases.
O(n)         Time requirement for a linear algorithm increases directly with the size of the problem.
O(n*log2n)   Time requirement for an n*log2n algorithm increases more rapidly than for a linear algorithm.
O(n^2)       Time requirement for a quadratic algorithm increases rapidly with the size of the problem.
O(n^3)       Time requirement for a cubic algorithm increases more rapidly with the size of the problem than the time requirement for a quadratic algorithm.
O(2^n)       As the size of the problem increases, the time requirement for an exponential algorithm increases too rapidly to be practical.
Growth-Rate Functions
If an algorithm takes 1 second to run with the problem size 8,
what is the time requirement (approximately) for that algorithm
with the problem size 16?
If its order is:
O(1)         T(n) = 1 second
O(log2n)     T(n) = (1 * log2 16) / (log2 8) = 4/3 seconds
O(n)         T(n) = (1 * 16) / 8 = 2 seconds
O(n*log2n)   T(n) = (1 * 16 * log2 16) / (8 * log2 8) = 8/3 seconds
O(n^2)       T(n) = (1 * 16^2) / 8^2 = 4 seconds
O(n^3)       T(n) = (1 * 16^3) / 8^3 = 8 seconds
O(2^n)       T(n) = (1 * 2^16) / 2^8 = 2^8 seconds = 256 seconds
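The same projections can be computed directly; a minimal C++ sketch (not from the slides) that scales T(8) = 1 second by the ratio f(16)/f(8) for each growth-rate function f:

#include <cmath>
#include <cstdio>

int main() {
    const double oldN = 8.0, newN = 16.0, oldTime = 1.0;   // seconds

    struct Rate { const char* name; double (*f)(double); };
    const Rate rates[] = {
        {"O(1)",        [](double)   { return 1.0; }},
        {"O(log2n)",    [](double n) { return std::log2(n); }},
        {"O(n)",        [](double n) { return n; }},
        {"O(n*log2n)",  [](double n) { return n * std::log2(n); }},
        {"O(n^2)",      [](double n) { return n * n; }},
        {"O(n^3)",      [](double n) { return n * n * n; }},
        {"O(2^n)",      [](double n) { return std::pow(2.0, n); }},
    };

    // New time = old time * f(newN) / f(oldN).
    for (const Rate& r : rates)
        std::printf("%-11s %g seconds\n", r.name, oldTime * r.f(newN) / r.f(oldN));
}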
Any Questions?