Algorithms Everything

This document summarizes and compares various algorithms and data structures. It discusses fundamental algorithmic analysis techniques like asymptotic analysis and recursive formulas. It also covers common data structures like stacks, queues, trees, heaps, tries, and hashes. Key sorting algorithms like bubble sort, selection sort, insertion sort, shell sort, merge sort, and quick sort are outlined. Fundamental search algorithms like sequential search, binary search, and hashing are also summarized. Finally, it briefly mentions graph algorithms like topological sort, Prim's algorithm, and Fibonacci heap operations.

Organization:

Algorithmic Analysis:
    Asymptotic notation and theorems; deriving and solving recursive formulas.

Data structures:
    Fundamental questions: when do we use each one? What are the insertion, deletion and traversal algorithms? Worst, best and average case for each operation? What is each structure best for?
    - Stacks
    - Queues
    - Trees: binary trees, binary search trees, AVL trees, 2-3 trees
    - Heaps (priority queues)
    - Tries / Huffman trees
    - Hash tables

Sorting Algorithms:
    Fundamental questions: is it stable? Is it in place? What is it best for? When doesn't it work? Best, worst and average complexity?
    - Bubble sort
    - Selection sort
    - Insertion sort
    - Shell sort
    - Merge sort
    - Quick sort
    - Sorting by counting
    Additional topics: closest pair (brute force), knapsack (brute force), dynamic programming, fake-coin problem (decrease and conquer).

Search Algorithms:
    Fundamental questions: what is it best for? When doesn't it work? Best, worst and average complexity?
    - Sequential search
    - Binary search
    - Search on binary search trees (BST, AVL and 2-3 trees)
    - Hashing
    - k-th smallest element

Graph algorithms:
    - Topological sort
    - Prim's algorithm (greedy)
    - Dijkstra's algorithm
    - Warshall's and Floyd's algorithms (dynamic programming)
    - Fibonacci heap operations

String search (different techniques):
    - Brute force
    - Horspool's algorithm
    - Rabin-Karp

Asymptotic Analysis:

The assumption is that the running time is proportional to the number of times the basic operation is executed.

Big O: O(g(n)) is the set of all functions that grow no faster than g(n) (slower than or at the same rate). It is used to find an upper bound.

Big Omega: Ω(g(n)) is the set of all functions that grow at least as fast as g(n) (faster than or at the same rate). It is used to find a lower bound.

Big Theta: Θ(g(n)) is used when the upper and lower bounds are asymptotically the same:
    t(n) ∈ Θ(g(n))  iff  t(n) ∈ O(g(n)) and t(n) ∈ Ω(g(n))

Definition with limits:
    lim_{n→∞} t(n)/g(n) = 0       → t(n) grows slower than g(n)
    lim_{n→∞} t(n)/g(n) = c > 0   → t(n) and g(n) have the same order, t(n) ∈ Θ(g(n))
    lim_{n→∞} t(n)/g(n) = ∞       → t(n) grows faster than g(n)

Useful formula (Stirling's approximation): n! ≈ √(2πn)·(n/e)^n
Recursive Formula

Example: a recursive search that splits the array into three parts ("3-Search"):

    Function 3Search(A[0..n-1], k)
        if n = 0 then
            return false
        if n = 1 then
            return A[0] == k
        return 3Search(A[0..n/3-1], k) OR 3Search(A[n/3..2n/3-1], k) OR 3Search(A[2n/3..n-1], k)

The basic operations are the comparisons. Each call performs a constant number of comparisons and then calls the function three times on subarrays of one third of the length, so for an array of length n = 3^h (h = log₃ n):

    C(n) = 3·C(n/3) + 2,   C(1) = 2

Backward substitution over the log₃ n levels gives C(n) = 3n − 1 ∈ Θ(n).

Master Theorem (divide and conquer recurrences)

    T(n) = a·T(n/b) + f(n),   with a ≥ 1, b > 1 and f(n) ∈ Θ(n^d), d ≥ 0

If b = 1 or f(n) ∉ Θ(n^d), the theorem doesn't apply. Solution:

    T(n) ∈ Θ(n^d)           if a < b^d
    T(n) ∈ Θ(n^d · log n)   if a = b^d
    T(n) ∈ Θ(n^(log_b a))   if a > b^d

(Applied to the 3-Search recurrence above: a = 3, b = 3, d = 0, so a > b^d and T(n) ∈ Θ(n^(log₃ 3)) = Θ(n), consistent with 3n − 1.)
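As a quick check of the recurrence, here is a minimal Python sketch of the 3-Search function (the function name and the comparison counter are illustrative, not from the notes); counting the two size checks of each call as the basic operation reproduces C(n) = 3·C(n/3) + 2 and the closed form 3n − 1 for n = 3^k in the worst (not-found) case:

    def three_search(a, m, counter):
        """Recursive 3-way search: split A into three thirds and OR the results."""
        n = len(a)
        counter[0] += 2            # the two size checks (n == 0, n == 1) are the counted ops
        if n == 0:
            return False
        if n == 1:
            return a[0] == m
        t = n // 3
        return (three_search(a[:t], m, counter)
                or three_search(a[t:2 * t], m, counter)
                or three_search(a[2 * t:], m, counter))

    for k in range(5):                                   # n = 1, 3, 9, 27, 81
        n = 3 ** k
        counter = [0]
        three_search(list(range(n)), -1, counter)        # -1 is never found: worst case
        print(n, counter[0], 3 * n - 1)                  # counter matches 3n - 1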

Data structures

Stack (LIFO: last in, first out)

    Push(newElement) → O(1)
    Pop()            → O(1)

A pointer to the top of the stack is kept; Push adds the new element at the top and Pop removes it from the top.

Search takes up to n pops, comparing each value: O(n) worst case, Ω(1) best case.

Queue (FIFO: first in, first out)

    Enqueue(element) → O(1)   (added at the tail)
    Dequeue()        → O(1)   (removed from the front)

Searching is O(n): dequeue each of the n elements and compare them.
Graphs G(V, E): a set of nodes (vertices) and edges.

Adjacency matrix: the entry in row i, column j is 1 if there is a connection from node i to node j.
    Undirected graphs have a symmetrical adjacency matrix.
    Directed graphs in general do not: an edge only connects the node towards the arrow direction.

Adjacency list: for each node, the list of nodes it is connected to.
Graph traversal (process all nodes in a graph)

Depth First Search (DFS)

After visiting a node, visit its neighbors recursively.

    ALGORITHM dfs(v)
        mark v as visited (increase the visit counter)
        for each neighbor w of v:
            if w has not been visited:
                dfs(w)

Another way to depict the traversal uses a stack: mark a node, push it and go to one of its neighbors (repeat); the LIFO structure makes it possible to go backwards when the current location has no unvisited neighbors (the current node is always on top). Each node is pushed the first time it is visited and popped once all of its neighbors have been visited.

DFS complexity: Θ(|V|²) with an adjacency matrix, Θ(|V| + |E|) with an adjacency list.
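A minimal DFS sketch over an adjacency-list graph (the dictionary-of-lists representation and the node labels are illustrative):

    def dfs(graph, start):
        """Depth-first search: after visiting a node, visit its neighbors recursively.
        Returns the nodes in the order they were first visited."""
        order = []
        visited = set()

        def visit(v):
            visited.add(v)
            order.append(v)               # record the visit order (the "counter")
            for w in graph[v]:            # go to the neighbors
                if w not in visited:      # only if the neighbor hasn't been visited
                    visit(w)

        visit(start)
        return order

    g = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3]}   # undirected edges listed both ways
    print(dfs(g, 1))                                    # [1, 2, 4, 3]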


Breadth First Search (BFS)

Visit all neighbors in a level in order, then go to the next level, and so on.

    ALGORITHM bfs(v)
        mark v as visited and enqueue it
        while the queue is not empty:
            remove the front node
            enqueue (and mark) every unvisited neighbor of that node

At any moment the queue is a snapshot of the frontier: the front holds the node whose neighbors are being explored, and the tail holds the most recently discovered nodes of the next level, so the nodes nearest to the start are visited first. BFS has the same complexity as DFS: Θ(|V|²) with an adjacency matrix, Θ(|V| + |E|) with an adjacency list.

Topological sorting

Works on directed acyclic graphs. Run DFS and push each node onto a stack when all of its neighbors have been visited (i.e. when DFS finishes with it). Reversing the order in which the nodes finished gives an ordering of the nodes such that all arrows go from left to right; a sketch of this procedure follows the example below.

Example from the notes: the nodes finish (are popped) in the order e, f, c, a, d, b; the reverse order b, d, a, c, f, e is the topological order.
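A sketch of DFS-based topological sorting, assuming a DAG given as an adjacency list; the example graph is illustrative (not the one drawn in the notes), chosen so the output matches the b, d, a, c, f, e ordering above:

    def topological_sort(graph):
        """Topological sort of a DAG via DFS finish times."""
        finished = []          # nodes in the order they finish (all neighbors done)
        visited = set()

        def dfs(v):
            visited.add(v)
            for w in graph.get(v, []):
                if w not in visited:
                    dfs(w)
            finished.append(v)  # "push" the node once every neighbor is done

        for v in graph:
            if v not in visited:
                dfs(v)
        return finished[::-1]   # reverse finish order = topological order

    dag = {'b': ['d'], 'd': ['a'], 'a': ['c'], 'c': ['f'], 'f': ['e'], 'e': []}
    print(topological_sort(dag))   # ['b', 'd', 'a', 'c', 'f', 'e']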
Search Algorithms and Data Structures

Sequential Search (brute force)

Keep searching sequentially until the value is found; if found, return the index.

Worst case: there are no matching elements, or the value is stored in the last position → C_worst(n) ∈ Θ(n).
Best case: the value is in the first position → C_best ∈ Ω(1).

Binary Search (decrease and conquer by a constant factor)

For searching in sorted arrays (ascending order), e.g. A = [0, 1, 4, 8].

Keep a leftmost and a rightmost index and compare the key with the value at the middle index:
    if the key equals the middle value, return the middle index;
    if the key is less than the middle value, rewrite the rightmost index with middle − 1 and repeat;
    if the key is greater than the middle value, rewrite the leftmost index with middle + 1 and repeat.

Worst case: happens when the key is not in the array.
    C_worst(n) = C_worst(n/2) + 1,   C_worst(1) = 1
    → C_worst(n) = log₂ n + 1 ∈ Θ(log n)

Best case: happens when the key is in the middle → C_best(n) ∈ Ω(1).
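A minimal iterative sketch of the procedure above (the array [0, 1, 4, 8] is the one used in the notes):

    def binary_search(a, key):
        """Iterative binary search on an ascending array; returns the index or -1."""
        left, right = 0, len(a) - 1
        while left <= right:
            middle = (left + right) // 2
            if a[middle] == key:
                return middle
            if key < a[middle]:
                right = middle - 1    # key is smaller: search the left half
            else:
                left = middle + 1     # key is larger: search the right half
        return -1                     # worst case: key is not in the array

    print(binary_search([0, 1, 4, 8], 4))   # 2
    print(binary_search([0, 1, 4, 8], 3))   # -1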

Trees → connected acyclic graphs (being acyclic means |E| = |V| − 1). If the graph is not connected, it is called a forest.

A node with no children is called a leaf.
All descendants of a node, together with the node itself, form the subtree rooted at that node.
The depth of a node is the length (measured counting edges) of the path from that node to the root.
The height of a tree is the length of the longest path from the root to a leaf.

Ordered trees → all children of each node are ordered (we assume left to right); e.g. nodes in the left subtree are smaller than those in the right subtree.

Binary tree → an ordered tree where each node has no more than two children; each child is designated as either left child or right child.

Complete tree → every level is filled from left to right; all levels are filled except possibly the last one, which may or may not be completely filled, but is filled in left-to-right order.

Full binary tree → each node has either 0 or 2 children.

Height calculation is Θ(n), as it visits all nodes.

Binary Search Tree (BST)

A binary tree with the following properties:
    keys in the left subtree are always less than the root, for every node;
    keys in the right subtree are always greater than the root, for every node.

A BST with n nodes has height bounded by
    ⌊log₂ n⌋ ≤ height ≤ n − 1
The upper bound is achieved by a "stick" (every node has exactly one child, so height = |E| = n − 1); the lower bound is achieved when the tree is as bushy as a full binary tree.

For a full binary tree, let m be the number of nodes at level h:
    h = 0 → m = 1 = 2⁰
    h = 1 → m = 2 = 2¹   (total n = 2 + 1 = 3)
    h = 2 → m = 4 = 2²
    h = 3 → m = 8 = 2³
In general n = 2^(h+1) − 1, so h(n) = log₂(n + 1) − 1; for a given number of nodes this is the lowest possible height.
Search operation (example: searching for k = 4)

    Algorithm search(root, k)
        if root is null then
            return not found
        if k == root.value then
            return root                      → if the key is the root value, return it
        if k < root.value then
            return search(root.left, k)      → if the key is less than the root value (e.g. 4 < 8), search the left subtree
        if k > root.value then
            return search(root.right, k)     → if the key is more than the root value (e.g. 4 > 3), search the right subtree

Complexity: O(height); it depends on the topology of the tree.
    W_worst ∈ Θ(n) for a "stick"; W_best ∈ Ω(log n) for a full binary tree; for random data W ∈ O(log n).
Insertion (example: 4 is to be inserted)

    Algorithm insert(root, newNode)
        if root is null then
            root = newNode                    → the current position is the correct one (the new node is a leaf)
        else if newNode.value < root.value then
            insert(root.left, newNode)        → if less than the root value, go left
        else if newNode.value ≥ root.value then
            insert(root.right, newNode)       → otherwise go right

The new node is always inserted at the first null position found, so it always becomes a leaf.

Complexity: O(height), with the same analysis as search; it depends on the topology.
    W_worst ∈ Θ(n) for a "stick"; W_best ∈ Ω(log n) for a full binary tree; for random data W ∈ O(log n).
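A minimal Python sketch of the search and insert routines above; the concrete key values are illustrative, chosen to reproduce the 4 < 8, 4 > 3, 4 < 6 path traced in the notes:

    class Node:
        def __init__(self, value):
            self.value, self.left, self.right = value, None, None

    def insert(root, value):
        """Insert as a leaf: go left if smaller, right otherwise; O(height)."""
        if root is None:
            return Node(value)               # the current null position is the correct one
        if value < root.value:
            root.left = insert(root.left, value)
        else:
            root.right = insert(root.right, value)
        return root

    def search(root, key):
        """Return the node holding key, or None; O(height)."""
        if root is None or root.value == key:
            return root
        if key < root.value:
            return search(root.left, key)    # key smaller: search the left subtree
        return search(root.right, key)       # key larger: search the right subtree

    root = None
    for v in [8, 3, 6, 10, 4]:
        root = insert(root, v)
    print(search(root, 4) is not None)       # True: 4 < 8 go left, 4 > 3 go right, 4 < 6 go left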

Deletion

Deletion is different from searching and inserting because it may affect the structure of the tree.

First, the node is searched as before. Then:
    if the node to be deleted is a leaf, simply set its parent's link to null;
    if the node to be deleted has a single child, replace the node with its child;
    if the node to be deleted has two children, replace its key with its inorder successor (the smallest key of its right subtree) and delete that node instead.

These transformations keep the BST structure valid, and the extra work on top of the search is done in constant time, so the complexity is still O(h).
Binary Tree Traversals

    Preorder:   root, left subtree, right subtree
    Inorder:    left subtree, root, right subtree
    Postorder:  left subtree, right subtree, root       (these three are DFS-like)
    Level order: visits nodes level by level starting from the root (similar to BFS)

(The notes illustrate each traversal on an example tree, numbering the nodes in the order they are visited.)
AVL trees (self-balancing trees, aiming to keep the binary search tree balanced so that operations stay close to O(log n))

The balance factor of a node is defined as:
    balance factor = height(left subtree) − height(right subtree)

An AVL tree is a binary search tree with a balance factor of −1, 0 or 1 for every subtree.

It satisfies:
    the binary tree condition (≤ 2 children per node);
    the binary search tree condition (left children ≤ root ≤ right children, for the whole tree);
    |balance factor| ≤ 1 for every node.

Search and insertion are the same as in a BST, unless an insertion makes the AVL unbalanced: when some node ends up with a balance factor of 2 or −2, local transformations (rotations) have to be performed to regain balance. Rotations are always done at the unbalanced node closest to the newly inserted leaf.
Right rotation (single R-rotation): used when the new leaf goes into the left subtree of the left child of the unbalanced node (e.g. inserting 1 after 3 and 2). The left child becomes the new root of the subtree, the old root goes to its right, and the left child's former right subtree (the middle values) is reattached as the old root's left subtree.

Left rotation (single L-rotation): the mirror case, used when an insertion into the right subtree of the right child makes the tree unbalanced (e.g. inserting 3 after 1 and 2). The middle value becomes the new root of the subtree: the smallest value goes to its left and the greatest value to its right.
Rotations have to reorganize the pointers in a way that still satisfies the BST conditions. For example, in a right rotation at a root r with left child c and subtrees T₁ < c < T₂ < r < T₃: c becomes the new root, r becomes its right child, T₁ stays as c's left subtree and T₃ as r's right subtree, and the middle subtree T₂ (whose keys are greater than c but less than r) is moved across to become r's new left subtree. Reattaching that middle subtree is what makes the rotation preserve the ordering.
Rotation examples:
    What AVL tree would be generated if the keys were introduced in sequential order (A < B < C < ...)? Every new key goes to the rightmost position, so the tree repeatedly becomes unbalanced and left rotations keep being applied; the result stays balanced instead of degenerating into a "stick".
    Inserting 5, 6, 8 in that order unbalances the root, and a left rotation makes 6 the new root with 5 and 8 as children.
    Inserting 10, then 15, then 12 produces a right child that is left-heavy, which requires an RL (right-left) double rotation; 12 becomes the root with 10 and 15 as children, and a later insertion of 13 simply becomes the left child of 15 without further rotations.
AVL complexity analysis

Insertion, deletion and search take the same time as in a normal BST; however, because of the rotations, the topology is always kept close to a full binary tree shape (balanced).

Complexity is still O(h), but now the height is tightly bounded:
    ⌊log₂ n⌋ ≤ h < 1.4405·log₂(n + 2) − 1.3277     for all n > 0
    (lower limit: from n ≤ 2^(h+1) − 1, the maximum number of nodes of height h; upper limit: from the minimum number of nodes an AVL tree of height h can have.)

This implies that for a given number of nodes the height is Θ(log n), so the complexity of all operations is Θ(log n).
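A minimal sketch of the two single rotations only (a full AVL insert also needs the LR/RL double cases and rebalancing on the way back up from the insertion point); the Node class and the 5, 6, 8 example are illustrative:

    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def height(n):
        return -1 if n is None else 1 + max(height(n.left), height(n.right))

    def balance_factor(n):
        """balance factor = height(left subtree) - height(right subtree)"""
        return height(n.left) - height(n.right)

    def rotate_right(r):
        """The left child becomes the new subtree root; its right subtree
        is reattached as the old root's left subtree."""
        new_root = r.left
        r.left = new_root.right
        new_root.right = r
        return new_root

    def rotate_left(r):
        """Mirror case: the right child becomes the new subtree root."""
        new_root = r.right
        r.right = new_root.left
        new_root.left = r
        return new_root

    chain = Node(5, right=Node(6, right=Node(8)))        # inserting 5, 6, 8 in order
    print(balance_factor(chain))                          # -2: unbalanced to the right
    rebalanced = rotate_left(chain)
    print(rebalanced.value, balance_factor(rebalanced))   # 6 0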
2-3 Trees (balance a BST by allowing more than one key in each node)

A 2-3 tree can have two types of nodes, 2-nodes and 3-nodes:
    a 2-node contains a single key k and has 2 children: left subtree keys < k, right subtree keys > k (as in a classical BST);
    a 3-node contains two ordered keys k₁ < k₂ and has 3 children: left subtree keys < k₁, middle subtree keys with k₁ ≤ k ≤ k₂, right subtree keys > k₂.

All leaves of a 2-3 tree are on the same level: the tree is perfectly height-balanced (the depth of every leaf is the same).

Searching for an element depends on whether the current node is a 2-node or a 3-node: if it is a 2-node, just compare the key and go left or right (similar to a BST); if it is a 3-node, use the two keys to decide between the left, middle or right subtree.

Insertion is always made in a leaf, except when the tree is empty. The place is found by performing a search for the key of the new node.
    If the leaf is a 2-node, insert just as in a BST (the leaf becomes a 3-node).
    If the leaf is a 3-node, split it in two: the smallest key stays on the left, the largest goes to the right, and the middle key is promoted to become part of the parent.

When a promotion makes the parent itself overfull (a temporary 4-node), the parent is split as well and its middle key is promoted further up; if the split propagates all the way to the root, a new root is created, which is how the tree grows in height while keeping all leaves on the same level.

Example from the notes: inserting 10, 15, 1 creates a 4-node that splits, making 10 the root with 1 and 15 as leaves; inserting 2 and then 3 splits the leaf (1, 2, 3) and promotes 2, giving a root with keys 2 and 10; further insertions such as 12, 17 and 19 keep splitting overfull nodes and promoting their middle keys, eventually splitting the root itself.
Complexity of 2-3 trees: also depends on the tree's height.

Worst case for h: a 2-3 tree with the smallest number of keys for its height is full of 2-nodes (in this case it is a full binary tree):
    n ≥ 2^(h+1) − 1  →  h ≤ log₂(n + 1) − 1
    (maximum height for a given number of keys n; minimum number of keys for a given height)

Best case for h: all nodes are 3-nodes:
    n ≤ 3^(h+1) − 1  →  h ≥ log₃(n + 1) − 1
    (minimum height for a given n; maximum number of keys for a given height)

Complexity: t ∈ O(h) with log₃(n + 1) − 1 ≤ h ≤ log₂(n + 1) − 1, so all operations are ∈ Θ(log n).

Priority Queues (store information based on priority)

Elements are stored as (element, priority) pairs. A common implementation of priority queues is made using heaps.

A heap is a complete binary tree (levels are filled from left to right) which satisfies the heap condition:
    priority(child) ≤ priority(parent) for all nodes     (max-heap; a min-heap uses ≥)

Completeness makes an array a good fit to represent the tree, and the heap condition implies that the root is the maximal element.

Array representation: the tree is stored in level order, left to right. With 1-based indexes, the parent of node i is ⌊i/2⌋ and its children are 2i and 2i + 1, so the heap condition can be expressed as A[⌊i/2⌋] ≥ A[i]. Unlike a BST there is no left-to-right order between the children of a node. The parental (internal) nodes occupy the first ⌊n/2⌋ positions of the array and the leaves the rest.
Constructing the heap from a list of keys (bottom-up construction)

    Initialize the array with the list of keys.
    Starting with the last parental node, i = ⌊n/2⌋, check whether the heap condition is met (A[i] is not smaller than its children).
    If it is not met, exchange the node with the child that has the largest key, and keep checking at the new position (continue until the heap condition is met).
    Go to the predecessor node i − 1 and repeat, until the working node is the root of the tree.

Parents are only located in the left half of the array (positions 1 to ⌊n/2⌋). Because of completeness, the height of a tree with n nodes is ≈ log₂(n + 1).

Complexity of heap creation:
    C_worst = 2(n − log₂(n + 1)) ∈ O(n)
    C_best ∈ Ω(n)   (even if the list is already heapified, all parental nodes are visited)
Insertion of an element in a heap

Strategy: insert the new key in the last position of the array; the node then climbs up while its priority is higher than its parent's (swap with the parent until the heap condition is true, or the node becomes the root).

Complexity of insertion: the comparisons made to climb up are limited to the height of the heap; as h ≈ log n, insertion is O(log n). The best case is Ω(1), which happens when the last position is already the correct position.
Deletion / ejection (of the root)

Eject the root, put the last value of the array in its place, and sift it down, comparing for the heap condition until it is met (moving down requires two comparisons per level: one to find the larger child and one to determine whether an exchange is required).

Complexity: comparisons are limited to the heap height → t ∈ O(h) = O(log n).

Ejecting n times: if one ejection is made in O(log n), ejecting n times is O(n log n), and ejecting n times creates a sorted array (see the discussion in the sorting section, heap sort).

Search: has to go over every element; there is no spatial ordering to exploit. Complexity: O(n).

Example of constructing a heap from a list of elements: the notes work through the bottom-up construction node by node, swapping a parent with its largest child whenever the heap condition is not met, until the maximum ends up at the root.
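A minimal 0-based array sketch of the three heap operations described above (the notes use 1-based indexes, so the parent/child formulas differ by one); the example keys are illustrative:

    def sift_down(heap, i, n):
        """Restore the max-heap condition at index i within heap[:n]."""
        while 2 * i + 1 < n:
            child = 2 * i + 1
            if child + 1 < n and heap[child + 1] > heap[child]:
                child += 1                       # pick the larger child
            if heap[i] >= heap[child]:
                break                            # heap condition met
            heap[i], heap[child] = heap[child], heap[i]
            i = child

    def build_heap(keys):
        """Bottom-up construction: sift down from the last parental node to the root; O(n)."""
        heap = list(keys)
        for i in range(len(heap) // 2 - 1, -1, -1):
            sift_down(heap, i, len(heap))
        return heap

    def insert(heap, key):
        """Insert at the last position and climb up while larger than the parent; O(log n)."""
        heap.append(key)
        i = len(heap) - 1
        while i > 0 and heap[(i - 1) // 2] < heap[i]:
            heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
            i = (i - 1) // 2

    def eject_root(heap):
        """Remove the maximum: put the last value at the root and sift it down; O(log n)."""
        root, heap[0] = heap[0], heap[-1]
        heap.pop()
        sift_down(heap, 0, len(heap))
        return root

    h = build_heap([2, 9, 7, 6, 5, 8])
    print(h[0])            # 9: the maximum is at the root
    insert(h, 10)
    print(eject_root(h))   # 10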
Hashing (an efficient way to implement dictionaries)

Idea: keys are distributed over an array H[0 .. m−1] called a hash table. A hash function h indicates the index of each key: h(k) assigns to every key an integer address in [0, m−1], and that address tells where the key is stored in the array.

The distribution of keys is not sequential; if the hash function assigns the same address to different keys, a collision occurs.

A good hash table size and a good hash function reduce the risk of collisions (still, a collision resolution mechanism is implemented; the method used depends on the hashing technique). If the hash function behaves badly, performance degrades from O(1) towards O(n), so the hash function must spread the keys over the bounded range in a close-to-uniform, pseudo-random way.

Open hashing (separate chaining)

Representation is done using linked lists: each cell of the table holds the list of all keys hashed to that address.

Example: h(k) = k mod m with m = 23.
    If key 42 is inserted, 42 mod 23 = 19, so 42 is stored in the list at address 19.
    If key 97 is searched, the index is calculated (97 mod 23 = 5) and 97 is compared to all elements in the list with that address; if H[5] holds 97, only one hash function call and a couple of comparisons are made.
    When there is a collision, the search goes over all the elements of that cell's list; an unsuccessful search looks at every item stored in the list.

Load factor:  α = n / m   (number of stored items over table size)

    Average number of probes, successful search   ≈ 1 + α/2
    Average number of probes, unsuccessful search ≈ α

Complexity: Θ(α) = Θ(n/m). Hashing reduces the number of comparisons with respect to sequential search by a factor of m; with a large enough m we keep α close to 1 and search close to O(1). (In contrast to open addressing / closed hashing, here α can exceed 1.) If we simply kept one linked list and searched it, the search would be sequential, O(n); with chaining and a well-dimensioned table, search is O(1) on average.
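A minimal separate-chaining sketch (the class name is illustrative; h(k) = k mod m and the keys 42 and 97 are the ones used in the notes' example):

    class ChainedHashTable:
        """Open hashing: each cell holds the list of keys hashed to that address."""

        def __init__(self, m=23):
            self.m = m
            self.table = [[] for _ in range(m)]

        def _hash(self, key):
            return key % self.m

        def insert(self, key):
            self.table[self._hash(key)].append(key)

        def search(self, key):
            # one hash call, then compare with every element in that cell's list
            return key in self.table[self._hash(key)]

    t = ChainedHashTable(m=23)
    t.insert(42)            # 42 mod 23 = 19 -> stored in the list at address 19
    t.insert(97)            # 97 mod 23 = 5
    print(t.search(97))     # True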

Open addressing (closed hashing). For these methods α ≤ 1.

Linear probing

All keys are stored in the hash table itself, without the use of linked lists.

Collision handling: if there is a collision, the key with the same h(k) is located in the next empty position; if the end of the table is reached, probing wraps around to the start. Search is done the same way: probe h(k), h(k)+1, ... until the key or an empty cell is found.

    α = n/m; if m → ∞ then α → 0; if n → m then α → 1.

    # successful search probes   ≈ ½ · (1 + 1/(1 − α))
    # unsuccessful search probes ≈ ½ · (1 + 1/(1 − α)²)

    As α → 1 both numbers blow up (worst case), so α has to be kept as low as possible: enlarge the table and don't let α get much beyond 0.9.

Problem of deletion: if the original key that caused a collision is deleted, there is no way to find the key with the duplicate address further along the table (a special "deleted" symbol can be used as a substitute).

Problem of clustering: clusters of occupied cells, separated by empty runs, are created inside the table, and this affects efficiency.

Solving the clustering problem with double hashing

Use a second hash function to determine an offset in order to find a free cell:
    h(k) is calculated; if there is a collision, probe h(k) + s(k), where for example s(k) = p − (k mod p) with p a prime smaller than m;
    if there is another collision, probe h(k) + 2·s(k), and continue until an empty spot is found.

Problem of filling up the table: when α gets close to 1 (n → m), performance deteriorates dramatically. The solution is rehashing, where the table is relocated to a larger table (typically twice the current size).

Examples (m = 11, h(k) = k mod 11, keys 44, 77, 100, 92, 63, 54, 52, 30; table indexes 0..10):

    44 mod 11 = 0;  77 mod 11 = 0;  100 mod 11 = 1;  92 mod 11 = 4;
    52 mod 11 = 8;  30 mod 11 = 8;  63 mod 11 = 8;   54 mod 11 = 10.

    With linear probing, 77 collides with 44 at index 0 and goes to the next empty cell; when probing runs past index 10 it wraps around, because the array is treated as circular.

    With double hashing using s(k) = 7 − (k mod 7): for key 52, h(52) = 8 is occupied, so try 8 + s(52) = 8 + 4 = 12, which wraps to index 1; if that is also occupied, try 8 + 2·s(52) = 16, which wraps to index 5.
Sorting Algorithms

A sorting algorithm is called stable if it preserves the relative order of two equal elements (the order from the input array): if an input list contains two equal elements in positions i and j where i < j, they end up in positions i′ and j′ where i′ < j′.

Generally, algorithms that can exchange keys that are far apart are not stable; however, they are usually faster.

An algorithm is said to be in place if it does not require extra memory, except maybe a couple of memory units.

Selection Sort (brute force / decrease and conquer)

Idea: find the minimum of the right subarray A[i..n−1] and swap it with the element at the current position i; after iteration i the values in positions 0..i are in their final place. Repeat for i = 0 .. n−2.

Complexity:
    C(n) = Σ_{i=0}^{n−2} Σ_{j=i+1}^{n−1} 1 = n(n−1)/2 ∈ Θ(n²)   comparisons
    swaps(n) = n − 1 ∈ Θ(n)   (one swap per position, the last one is never needed)

The algorithm is in place and not stable: the long-distance swaps can alter the relative order of equal keys (e.g. if the maximum value is repeated at the beginning, the order of the copies would be changed).
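A minimal sketch of selection sort (the example list is illustrative):

    def selection_sort(a):
        """Find the minimum of A[i..n-1] and swap it into position i;
        n(n-1)/2 comparisons, n-1 swaps."""
        n = len(a)
        for i in range(n - 1):
            minimum = i
            for j in range(i + 1, n):            # scan the right subarray for its minimum
                if a[j] < a[minimum]:
                    minimum = j
            a[i], a[minimum] = a[minimum], a[i]  # one swap per position
        return a

    print(selection_sort([89, 45, 68, 90, 29, 34, 17]))   # [17, 29, 34, 45, 68, 89, 90]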

Bubble sort (brute force)

Idea: take the maximum element of each left subarray A[0..n−2−i] and send it to the right by swapping it iteratively with its neighbors. In iteration i the largest remaining value ends up in the last unsorted position, which is already correct and is not compared in later iterations.

Complexity:
    C(n) = Σ_{i=0}^{n−2} Σ_{j=0}^{n−2−i} 1 = n(n−1)/2 ∈ Θ(n²)   comparisons (always)
    The number of swaps depends on the input: if A[j+1] < A[j] for all j (descending order) every comparison produces a swap, so swaps ∈ O(n²); if the array is already in ascending order no swaps are made, so swaps ∈ Ω(1). A swap means the current element is greater than its neighbor.

The algorithm is in place and stable.
Insertion Sort (decrease and conquer)

Idea: the left piece A[0..i−1] is already sorted; take the current value v = A[i] (starting at the second position) and insert it into the sorted part by shifting to the right every element greater than v, until an element that is not greater (or the left end of the array) is found. The large right-to-left shifts this causes are what shell sort tries to avoid.

Complexity: the number of comparisons (A[j] > v in the while loop) depends on the input.

Worst case: happens when the array is in decreasing order:
    C_worst = Σ_{i=1}^{n−1} Σ_{j=0}^{i−1} 1 = n(n−1)/2 ∈ Θ(n²)

Best case: happens when the array is already sorted (it never enters the while loop):
    C_best = n − 1 ∈ Θ(n)
    → good for almost sorted arrays.

For random data it performs close to the worst case (on average about half of it, i.e. roughly twice as fast as bubble and selection sort).

The algorithm is in place and stable.
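A minimal sketch of insertion sort (the example list is illustrative):

    def insertion_sort(a):
        """Insert A[i] into the already-sorted left part A[0..i-1],
        shifting larger elements one place to the right."""
        for i in range(1, len(a)):
            v = a[i]                      # current value to insert
            j = i - 1
            while j >= 0 and a[j] > v:    # comparisons stop early on almost sorted input
                a[j + 1] = a[j]           # shift the larger element to the right
                j -= 1
            a[j + 1] = v
        return a

    print(insertion_sort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]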

Shell sort

Insertion sort works well for almost sorted arrays (almost linear time), so the idea is to almost sort the array first and then apply insertion sort; in this way the large shifts from right to left are reduced.

To almost sort the array, it is grouped into interleaved sublists of elements a fixed gap apart (e.g. gap k = 4): each sublist is insertion-sorted on its own, then the gap is reduced and the process is repeated, finishing with gap 1, i.e. a plain insertion sort on an almost sorted list.

Complexity of shell sort: the almost-sorting passes bring the average case down from Θ(n²) to roughly O(n^1.5) (≈ O(n log n) is quoted in the notes); the best case, when the array is already sorted, is about O(n log n). The algorithm is in place but not stable.

Merge Sort (divide and conquer by a factor of 2)

Idea: split the array into a first half and a second half, sort each half recursively (keep splitting until the pieces have 1 or 2 elements), then merge the two sorted halves into one ordered array while returning from the recursion.

Merging: keep one index in each half and one in the output array; copy the smaller of the two current elements and advance its index; when one half runs out, copy the remaining elements of the other half. E.g. merging B = [3, 8] and C = [2, 9]: min(2, 3) = 2 → A = [2]; min(3, 9) = 3 → A = [2, 3]; min(8, 9) = 8 → A = [2, 3, 8]; copy the remaining 9 → A = [2, 3, 8, 9].

Complexity:
    C(n) = 2·C(n/2) + C_merge(n),   C(1) = 0
    The merge while-loop iterates over the whole (sub)array, so C_merge ∈ Θ(n); in the worst case C_merge(n) = n − 1.
    C_worst(n) = 2·C_worst(n/2) + n − 1 → C_worst(n) = n·log₂ n − n + 1 ∈ Θ(n log n)
    Best and average cases are also ∈ Θ(n log n).

The algorithm is not in place (it needs auxiliary arrays for the halves) but it is stable.
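A minimal sketch of merge sort (returning a new list rather than sorting in place, which matches the "not in place" remark above):

    def merge_sort(a):
        """Split into halves, sort each recursively, then merge; Θ(n log n)."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

    def merge(b, c):
        """Merge two sorted lists by repeatedly copying the smaller front element."""
        merged, i, j = [], 0, 0
        while i < len(b) and j < len(c):
            if b[i] <= c[j]:              # <= keeps equal keys in order: stable
                merged.append(b[i]); i += 1
            else:
                merged.append(c[j]); j += 1
        merged.extend(b[i:])              # copy whatever remains of either half
        merged.extend(c[j:])
        return merged

    print(merge([3, 8], [2, 9]))           # [2, 3, 8, 9], the merge example from the notes
    print(merge_sort([8, 3, 2, 9, 7, 1]))  # [1, 2, 3, 7, 8, 9]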
Quick Sort (divide and conquer)

Idea: for one element p (the pivot) in a sorted array, everything to its left is ≤ p and everything to its right is ≥ p. If we find the position in the array where everything on the left is ≤ p and everything on the right is ≥ p, that is the pivot's correct, final position; we can then find this position recursively for every element, applying the same procedure to the left and right subarrays (without the pivot).

Partitioning: the pivot is the first element of the subarray. An index i starts at the left and moves right while the elements are less than the pivot; an index j starts at the right and moves left while the elements are greater than the pivot. When both stop, A[i] and A[j] are swapped (an element ≤ p moves into the left portion, an element ≥ p into the right portion). When i and j cross, the pivot is swapped with A[j]; j is returned as the pivot's final position, with the left portion ≤ p and the right portion ≥ p.

Complexity:
    C_worst ∈ Θ(n²)   (about n²/2 comparisons, e.g. on an already sorted array with this pivot choice)
    C_best ∈ Θ(n log n)
    C_avg ≈ 1.39·n·log₂ n ∈ Θ(n log n)

It runs faster than merge sort and heap sort on randomly ordered arrays of nontrivial sizes. Pivot selection makes quicksort more efficient. The algorithm is in place but not stable.
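A sketch of quicksort with the two-index partition described above; the pivot is simply the first element, so an already-sorted input is the worst case, and the example data is illustrative:

    def quicksort(a, low=0, high=None):
        """In-place quicksort: partition around the first element, then recurse on both sides."""
        if high is None:
            high = len(a) - 1
        if low < high:
            s = partition(a, low, high)       # s = final position of the pivot
            quicksort(a, low, s - 1)          # left portion:  elements <= pivot
            quicksort(a, s + 1, high)         # right portion: elements >= pivot

    def partition(a, low, high):
        """i moves right past elements < pivot, j moves left past elements > pivot;
        out-of-place pairs are swapped; when i and j cross, the pivot goes to position j."""
        pivot = a[low]
        i, j = low + 1, high
        while True:
            while i <= high and a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i >= j:
                break
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
        a[low], a[j] = a[j], a[low]           # pivot goes to its final position
        return j

    data = [5, 3, 1, 9, 8, 2, 4, 7]
    quicksort(data)
    print(data)    # [1, 2, 3, 4, 5, 7, 8, 9]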
Heap sort

1) Construct the heap from the list of elements: O(n) (bottom-up construction).
2) Apply root deletion (ejection) n − 1 times: O(n log n). Each ejected root goes to the freed last position, so the array ends up sorted.

Complexity: Θ(n log n) for both the worst and the average case.
The algorithm is in place and not stable.
Sorting by Counting (a space-time tradeoff)

Create an array of counters, one per element; the counter of an element ends up being the index of that element in the new, sorted array.

For each element, compare it with every element to its right: if the other element is greater, the other element's counter increases; if it is lower (or equal), the current element's counter increases.

Example: Array = [62, 31, 84, 96, 19, 47]
    i = 0: compare 62 with 31, 84, 96, 19, 47 → Count = [3, 0, 1, 1, 0, 0]
    i = 1: compare 31 with 84, 96, 19, 47     → Count = [3, 1, 2, 2, 0, 1]
    i = 2: compare 84 with 96, 19, 47         → Count = [3, 1, 4, 3, 0, 1]
    i = 3: compare 96 with 19, 47             → Count = [3, 1, 4, 5, 0, 1]
    i = 4: compare 19 with 47                 → Count = [3, 1, 4, 5, 0, 2]
    Count now holds the final positions, so the array is reorganized into [19, 31, 47, 62, 84, 96].

It performs the minimum number of moves (each element is placed directly in its final position), but the number of comparisons is n(n−1)/2 ∈ Θ(n²) and it is not in place. (A related variant, distribution counting, counts occurrences of each value at once and works in linear time when the values come from a small discrete range.)
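A minimal sketch of the comparison-counting procedure traced above (the example array is the one from the notes):

    def comparison_counting_sort(a):
        """For each element, count how many elements are smaller; that count is its
        index in the sorted array. Θ(n²) comparisons, minimum moves, not in place."""
        n = len(a)
        count = [0] * n
        for i in range(n):
            for j in range(i + 1, n):
                if a[i] < a[j]:
                    count[j] += 1        # the greater element's counter increases
                else:
                    count[i] += 1        # otherwise the current element's counter increases
        result = [None] * n
        for i in range(n):
            result[count[i]] = a[i]      # place each element directly in its final position
        return result

    print(comparison_counting_sort([62, 31, 84, 96, 19, 47]))   # [19, 31, 47, 62, 84, 96]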
Comparison of Sorting Algorithms
(algorithm / idea behind it / complexity / best for / stable, in place)

Bubble sort (brute force): take the greatest value of the left subarray A[0..n−1−i] to the right, swapping with neighbors each time. Comparisons ∈ Θ(n²); swaps ∈ O(n²) for a descending array down to Ω(1), so swaps are decreased if the array is almost sorted. A stable alternative to selection sort. Stable, in place.

Selection sort: select the minimum element of the right subarray A[i..n−1] and put it on the left (position i). Comparisons n(n−1)/2 ∈ Θ(n²); swaps n − 1 ∈ Θ(n), i.e. the worst-case number of swaps is linear. Not stable, in place.

Insertion sort: shift the sorted part to the right until the place of the current element is found. Worst case n(n−1)/2 ∈ Θ(n²) (array in descending order); best case n − 1 ∈ Θ(n) (array already sorted); average ∈ Θ(n²), about twice as fast as bubble and selection sort on random data. Best for almost sorted arrays, achieving near-linear time. Stable, in place.

Shell sort: almost sort the array with gap sublists and then apply insertion sort. Worst case Θ(n²); best and average roughly O(n log n) ≈ O(n^1.5). Gains over Θ(n²) at the cost of losing stability. Not stable, in place.

Merge sort: split the array down, sort the small sub-instances and merge them. Worst, best and average case ∈ Θ(n log n). Sacrifices being in place for having every case in n log n. Stable, not in place.

Quick sort: find the position of an element (the pivot) where all elements before it are less and all elements after it are more, and repeat recursively on each part. Worst ∈ Θ(n²); best and average ∈ Θ(n log n). Fastest on random data; loses stability to gain speed. Not stable, in place.

Heap sort: create a heap from the list of elements and eject all of them. Worst, best and average case ∈ Θ(n log n). Like merge sort in complexity, but in place. Not stable, in place.

Sorting by counting: count, for each element, how many elements are smaller, to find its index in the sorted array. Comparisons ∈ Θ(n²), but it performs the minimum number of moves; the distribution-counting variant works in linear time when the values come from a small discrete range. Stable, not in place.
Dynamic programming

Solve a big problem by solving sub-instances of overlapping subproblems, storing each subproblem's solution to reuse it later.

Optimization problems in dynamic programming assume the optimality principle, which states that an optimal solution for a problem is composed of optimal solutions to its subproblems.

Coin-row problem: find the selection of coins that maximizes the total sum, with the constraint that no two coins taken can be adjacent.

E.g. C = [5, 1, 2, 10, 6, 2]. For each coin we can either take it or not: if we take coin n we must give up coin n−1 (so we add its value to the best solution up to n−2); if we don't take it, the cumulative value stays the same as for n−1.

Recursive formula:
    F(n) = max( c_n + F(n−2), F(n−1) ),    F(0) = 0,  F(1) = c₁

The solution F[n] depends on previously solved sub-instances, so the calculation is made bottom-up: each F[i] is calculated once, stored and reused, instead of re-expanding the recursion tree, where F[3], F[2], ... would be recomputed many times down to the base cases. (For this example F(6) = 17, taking the coins 5, 10 and 2.)
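A minimal bottom-up sketch of the coin-row recurrence (the coin list is the notes' example):

    def coin_row(coins):
        """F(n) = max(c_n + F(n-2), F(n-1)); bottom-up, each subproblem solved once."""
        if not coins:
            return 0
        f = [0] * (len(coins) + 1)
        f[1] = coins[0]
        for n in range(2, len(coins) + 1):
            f[n] = max(coins[n - 1] + f[n - 2],   # take coin n (skip coin n-1)
                       f[n - 1])                  # or leave it: the value stays the same
        return f[-1]

    print(coin_row([5, 1, 2, 10, 6, 2]))   # 17  (coins 5, 10 and 2)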
Dynamic programming for solving graph problems

Warshall's algorithm (finds the transitive closure of a graph)

The transitive closure is a matrix of size |V| × |V| where entry (i, j) is 1 if there is a path of any length from the row node i to the column node j.

R^(k)[i, j] says whether there is a path from node i to node j using only nodes 1..k as intermediate nodes (0 ≤ k ≤ n):
    R^(0) is just the adjacency matrix: R^(0)[i, j] = A[i, j];
    R^(n)[i, j] says whether node j is reachable from i using any of the n nodes, so R^(n) is the solution (the transitive closure).

The idea is to compute R^(n) starting from the adjacency matrix. Recurrence relation:
    R^(0)[i, j] = A[i, j]
    R^(k)[i, j] = R^(k−1)[i, j] OR ( R^(k−1)[i, k] AND R^(k−1)[k, j] )
    (either there is already a connection from i to j, or there is a connection from i to some node k and from k to j)

Another way to write it replaces the boolean operators with arithmetic:
    R^(k)[i, j] = R^(k−1)[i, j] + R^(k−1)[i, k] · R^(k−1)[k, j]   (any nonzero value meaning "reachable")

Complexity: Θ(n³) (after initialization, iterate over each node for every pair). Ideal for dense graphs; not the best for sparse graphs (a DFS from each node is better there).

Example: starting from the adjacency matrix of a small directed graph, the notes compute R^(1), R^(2), R^(3), ... one intermediate node at a time until R^(n), the transitive closure, is obtained.
Floyd's Algorithm (all-pairs shortest paths)

Calculates the matrix of minimum distances of a weighted graph; if there is no connection between two nodes the entry is set to infinity (if the graph is unweighted, every edge weight is 1). The graph may have negative weights but not negative cycles (those would produce an infinite loop of improvements).

The approach is the same as Warshall's, but on the weight matrix:
    D^(0)[i, j] = W[i, j]     (the weight matrix of the graph: ∞ where there is no edge, 0 on the diagonal)

Recursive relation:
    D^(k)[i, j] = min( D^(k−1)[i, j],  D^(k−1)[i, k] + D^(k−1)[k, j] )
    (the current minimum from i to j, versus going from i to some intermediate node k and then from k to j)

Example: starting from W, the matrices D^(1), D^(2), D^(3), ... are computed; an entry is rewritten whenever a shorter path through the new intermediate node is found.
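A minimal sketch of Floyd's algorithm (the weight matrix is illustrative; float('inf') stands for "no edge"):

    INF = float('inf')

    def floyd(w):
        """All-pairs shortest paths: D(k)[i][j] = min(D(k-1)[i][j], D(k-1)[i][k] + D(k-1)[k][j])."""
        n = len(w)
        d = [row[:] for row in w]            # D(0) is the weight matrix
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]   # a shorter path through k was found
        return d

    w = [[0,   3, INF,   7],
         [8,   0,   2, INF],
         [5, INF,   0,   1],
         [2, INF, INF,   0]]
    for row in floyd(w):
        print(row)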

Greedy Algorithms

Only used for optimization purposes. They try to build the global optimum based on locally optimal choices. If the optimality principle holds, the global optimum is reached; otherwise an approximation is given. The choices made in each step must be feasible, locally optimal and irrevocable.

Prim's Algorithm (finding the Minimum Spanning Tree)

Spanning trees are the subgraphs that contain all the nodes but have no cycles (trees). Prim's algorithm starts with one node and unveils its neighbors; the currently unveiled neighbors are the elements of a priority queue, keyed by the weight of the edge connecting them to the tree. In each step the element with most priority (least edge weight) is ejected and added to the spanning tree, and its own neighbors are unveiled and added to (or updated in) the queue. The algorithm finishes when all nodes are in the tree. (Ties mean the algorithm could have gone another way and still produce a minimum spanning tree.)

Complexity:
    O(|V|²) if the priority queue is an array (adjacency matrix);
    O(|E| log |V|) if the priority queue is a min-heap and the graph is an adjacency list.
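A minimal min-heap sketch of Prim's algorithm; the {node: [(weight, neighbor), ...]} representation and the example graph are illustrative:

    import heapq

    def prim(graph, start):
        """Minimum spanning tree with a min-heap priority queue; O(|E| log |V|)."""
        in_tree = {start}
        edges = list(graph[start])            # fringe: edges from the tree to outside nodes
        heapq.heapify(edges)
        mst = []
        while edges and len(in_tree) < len(graph):
            weight, v = heapq.heappop(edges)  # least-weight edge has most priority
            if v in in_tree:
                continue                      # stale entry: v was already added
            in_tree.add(v)
            mst.append((weight, v))
            for w, u in graph[v]:
                if u not in in_tree:
                    heapq.heappush(edges, (w, u))
        return mst

    g = {'a': [(3, 'b'), (5, 'c')],
         'b': [(3, 'a'), (4, 'c'), (2, 'd')],
         'c': [(5, 'a'), (4, 'b'), (6, 'd')],
         'd': [(2, 'b'), (6, 'c')]}
    print(prim(g, 'a'))   # [(3, 'b'), (2, 'd'), (4, 'c')], total weight 9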
Dijkstra's Algorithm (shortest paths)

Finds the shortest path from a given node to all remaining nodes. It uses a similar scheme and the same min-priority-queue concept as Prim's algorithm, but in a cumulative way: the priority of a fringe node is its total distance from the source, not the weight of a single edge.

The notes trace an example where, starting from node a, the nodes are ejected from the queue in increasing order of their current cumulative distance (e.g. d at distance 3, then c at 5, and so on), giving both the order of visitation and the final distance to every node.

Complexity:
    O(|V|²) if the priority queue is an array (adjacency matrix);
    O(|E| log |V|) if the priority queue is a min-heap (adjacency list).
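A minimal min-heap sketch of Dijkstra's algorithm; the graph representation and edge weights are illustrative, roughly in the spirit of the notes' example:

    import heapq

    def dijkstra(graph, source):
        """Shortest distances from source; priority = cumulative distance, non-negative weights."""
        dist = {source: 0}
        queue = [(0, source)]
        done = set()
        while queue:
            d, v = heapq.heappop(queue)       # closest unfinished node
            if v in done:
                continue
            done.add(v)
            for w, u in graph.get(v, []):
                if d + w < dist.get(u, float('inf')):
                    dist[u] = d + w           # a shorter cumulative path was found
                    heapq.heappush(queue, (dist[u], u))
        return dist

    g = {'a': [(3, 'd'), (7, 'b')],
         'd': [(3, 'a'), (2, 'c'), (4, 'e')],
         'c': [(2, 'd'), (1, 'b')],
         'b': [(7, 'a'), (1, 'c')],
         'e': [(4, 'd')]}
    print(dijkstra(g, 'a'))   # {'a': 0, 'd': 3, 'c': 5, 'b': 6, 'e': 7}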

Huffman Encoding (for data compression)

Given a sequence of characters, we can count how many times each character appears and replace each character with a variable-length binary code. A prefix-free representation is used: no code is the prefix of another code, so the encoded stream can be decoded without separators.

The code can be depicted as a binary tree where a 0 goes left and a 1 goes right; each character sits at a leaf, and its code is the path from the root down to that leaf. For example, 0000 means "go left 4 times" and 11 means "go right 2 times".

Huffman algorithm: starting from a list of characters and frequencies, the Huffman tree is formed by repeatedly taking the two subtrees with the least values (lowest frequencies) and forming a new subtree, where the minimum value goes on the left and the maximum on the right; the combined frequency goes back into the list. This is repeated until a single tree, the Huffman tree (a trie), remains; it is used both to encode and to decode.

Code built in the notes' example:
    G = 00, D = 01, A = 11, B = 100, E = 101
    With this code the bit string 100 11 00 00 101 01 decodes to "BAGGED".
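A minimal sketch of building Huffman codes with a min-heap; the frequencies are illustrative, and because ties can be broken differently the resulting codes may not be exactly the G = 00, D = 01, A = 11, B = 100, E = 101 table above:

    import heapq
    from itertools import count

    def huffman_codes(frequencies):
        """Repeatedly merge the two least-frequent subtrees (smaller on the 0/left branch,
        larger on the 1/right branch) until a single Huffman tree remains."""
        tiebreak = count()                              # keeps heap entries comparable
        heap = [(f, next(tiebreak), ch) for ch, f in frequencies.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)           # least frequent subtree
            f2, _, right = heapq.heappop(heap)          # second least frequent
            heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
        root = heap[0][2]

        codes = {}
        def walk(node, code):
            if isinstance(node, tuple):                 # internal node: 0 goes left, 1 goes right
                walk(node[0], code + '0')
                walk(node[1], code + '1')
            else:
                codes[node] = code or '0'
            return codes
        return walk(root, '')

    freq = {'A': 2, 'B': 1, 'D': 1, 'E': 1, 'G': 2}     # illustrative frequencies
    codes = huffman_codes(freq)
    print(codes)
    print(''.join(codes[c] for c in 'BAGGED'))          # the encoded bit string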
String Matching

Brute force approach: slide the pattern along the text one position at a time, comparing character by character. Complexity: it performs m(n − m + 1) comparisons in the worst case → O(nm), comparing almost the whole pattern at every position; on random texts it has been shown to behave as O(n).

Horspool's Algorithm

Uses a table of shifts, computed from the pattern, that gives for each character how far the pattern can be shifted when that character is the text character aligned with the end of the pattern (it tries to maximize the possible shift). Characters are compared right to left; on a mismatch (or after a full match) the shift from the table is applied.

Complexity: worst case Θ(nm); on random text ≈ Θ(n).
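A minimal sketch of Horspool's shift table and search; the text/pattern pair is the classic textbook example, not taken from the notes:

    def shift_table(pattern):
        """For every character, how far the pattern can shift when that character is
        the text character aligned with the last pattern position."""
        m = len(pattern)
        table = {}
        for i, ch in enumerate(pattern[:-1]):
            table[ch] = m - 1 - i            # distance from its rightmost occurrence (last char excluded)
        return table

    def horspool(text, pattern):
        """Return the index of the first match, or -1; compare right to left, then shift."""
        m, n = len(pattern), len(text)
        table = shift_table(pattern)
        i = m - 1                            # text index aligned with the pattern's last character
        while i < n:
            k = 0
            while k < m and pattern[m - 1 - k] == text[i - k]:
                k += 1                       # compare right to left
            if k == m:
                return i - m + 1             # full match found
            i += table.get(text[i], m)       # shift by the table value (m if not in the table)
        return -1

    print(horspool("JIM_SAW_ME_IN_A_BARBERSHOP", "BARBER"))   # 16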
