Bayesian Networks: A Tutorial
Presented By:
• Anuja Sharma(20MCA301)
• Pinku Borah(20MCA304)
• Tasso Tatho(20MCA312)
• Arup Basumatary(21MCA205)
• Kago Moda(21MCA210)
INTRODUCTION
Definition:
We can define a Bayesian network as: "A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."
• It is also called a Bayes network, belief network, decision network, or Bayesian model.
The Joint Probability Distribution: the probability of every combination of values of the different variables.
A Set of Tables for Each Node:
Each node Xi has a conditional probability distribution P(Xi | Parents(Xi)) that quantifies the effect of the parents on the node. The parameters are the probabilities in these conditional probability tables (CPTs).

Example network (A → B, B → C, B → D) and its CPTs:

A       P(A)
false   0.6
true    0.4

A       B       P(B|A)
false   false   0.01
false   true    0.99
true    false   0.7
true    true    0.3

B       C       P(C|B)
false   false   0.4
false   true    0.6
true    false   0.9
true    true    0.1

B       D       P(D|B)
false   false   0.02
false   true    0.98
true    false   0.05
true    true    0.95
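As a rough, minimal sketch (not part of the original slides), the structure and CPTs above could be stored directly in Python dictionaries; the edges and all probability values are taken from the tables, while the dictionary names are only illustrative.

# Example network from the slides: A -> B, B -> C, B -> D
# parents[X] lists the parents of node X.
parents = {"A": [], "B": ["A"], "C": ["B"], "D": ["B"]}

# cpt[X] maps (parent values..., X's value) -> probability,
# e.g. cpt["B"][(True, False)] = P(B=false | A=true) = 0.7
cpt = {
    "A": {(False,): 0.6, (True,): 0.4},
    "B": {(False, False): 0.01, (False, True): 0.99,
          (True, False): 0.70, (True, True): 0.30},
    "C": {(False, False): 0.40, (False, True): 0.60,
          (True, False): 0.90, (True, True): 0.10},
    "D": {(False, False): 0.02, (False, True): 0.98,
          (True, False): 0.05, (True, True): 0.95},
}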
A Set of Tables for Each Node
Conditional Probability Distribution for C given B:

B       C       P(C|B)
false   false   0.4
false   true    0.6
true    false   0.9
true    true    0.1

For a given combination of values of the parents (B in this example), the entries for P(C=true | B) and P(C=false | B) must add up to 1,
e.g. P(C=true | B=false) + P(C=false | B=false) = 1.
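A small sanity check, sketched in the same illustrative Python style as above, makes this normalization constraint concrete for the P(C | B) table:

# Each row of the CPT for C must sum to 1 over C's values:
# P(C=true | B=b) + P(C=false | B=b) = 1 for every value b of the parent B.
p_c_given_b = {(False, False): 0.4, (False, True): 0.6,
               (True, False): 0.9, (True, True): 0.1}

for b in (False, True):
    row_sum = p_c_given_b[(b, False)] + p_c_given_b[(b, True)]
    assert abs(row_sum - 1.0) < 1e-9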
Bayesian Networks
Two important properties:
1. Encodes the conditional independence relationships between the variables in the graph
structure
2. Is a compact representation of the joint probability distribution over the variables
Conditional Independence
The Markov condition: given its parents (P1, P2), a node (X) is conditionally independent of its non-descendants (ND1, ND2).

[Figure: node X with parents P1 and P2, non-descendants ND1 and ND2, and children C1 and C2]
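Applied to the example network used earlier (A → B, B → C, B → D): once the value of its parent B is known, node C is conditionally independent of A and of D, since neither A nor D is a descendant of C.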
The Joint Probability Distribution
Due to the Markov condition, we can compute the joint probability distribution over all the variables X1, …, Xn in the Bayesian net using the formula:

P(X1, …, Xn) = P(X1 | Parents(X1)) × P(X2 | Parents(X2)) × … × P(Xn | Parents(Xn))

because, for each variable, P(Xi | Xi-1, …, X1) = P(Xi | Parents(Xi)).
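Written out for the example network (A → B, B → C, B → D), this factorization reads:

P(A, B, C, D) = P(A) × P(B | A) × P(C | B) × P(D | B)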
Using a Bayesian Network Example:
With the example network (A → B, B → C, B → D), the probability of any assignment of values to the variables can be computed; the numbers used in the calculation come from the conditional probability tables.
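As one concrete illustration, using the CPT values from the tables above:

P(A=true, B=true, C=true, D=true)
= P(A=true) × P(B=true | A=true) × P(C=true | B=true) × P(D=true | B=true)
= 0.4 × 0.3 × 0.1 × 0.95
= 0.0114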
Inference
• Using a Bayesian network to compute probabilities is called inference.
• In general, inference involves queries of the form:
P( X | E )
X = the query variable(s)
E = the evidence variable(s)
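As a minimal sketch of such a query in code (not part of the original slides; the query P(D | A=true) and all names are chosen only for illustration, with the CPT values taken from the earlier tables), inference in this small network can be done by brute-force enumeration:

from itertools import product

# CPTs of the example network A -> B, B -> C, B -> D (values from the tables above).
p_a = {False: 0.6, True: 0.4}
p_b_given_a = {(False, False): 0.01, (False, True): 0.99, (True, False): 0.7, (True, True): 0.3}
p_c_given_b = {(False, False): 0.4, (False, True): 0.6, (True, False): 0.9, (True, True): 0.1}
p_d_given_b = {(False, False): 0.02, (False, True): 0.98, (True, False): 0.05, (True, True): 0.95}

def joint(a, b, c, d):
    # P(A, B, C, D) = P(A) * P(B | A) * P(C | B) * P(D | B)
    return p_a[a] * p_b_given_a[(a, b)] * p_c_given_b[(b, c)] * p_d_given_b[(b, d)]

def p_d_given_evidence_a(a_value):
    # P(D | A=a): sum the joint over the hidden variables B and C,
    # then normalize over the values of D.
    unnormalized = {
        d: sum(joint(a_value, b, c, d) for b, c in product((False, True), repeat=2))
        for d in (False, True)
    }
    total = sum(unnormalized.values())
    return {d: p / total for d, p in unnormalized.items()}

print(p_d_given_evidence_a(True))  # distribution of D given the evidence A=true

For larger networks, this brute-force sum becomes infeasible, and exact algorithms such as variable elimination or approximate sampling methods are used instead.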
Conclusion
There are innumerable applications of Bayesian networks in spam filtering, semantic search, information retrieval, and many more areas. For example, given a symptom, we can predict the probability of a disease occurring, taking into account several other factors that contribute to the disease. Thus, the concept of the Bayesian network has been introduced in this presentation along with a worked real-life example.
THANK YOU