Unit 4

The document provides an overview of probability theory, including definitions, approaches, laws, and applications. It covers key concepts such as classical, empirical, and subjective probability, as well as probability distributions like binomial, Poisson, and normal distributions. Additionally, it discusses calculations for event probabilities and the use of Bayes' theorem in various fields.

### **Theory of Probability**

Probability is a branch of mathematics that measures the likelihood of an event occurring. It is used to predict outcomes in uncertain situations based on theoretical principles or observed data.

---

### **1. Meaning of Probability**

- **Definition**: Probability is a measure of the likelihood or chance of an event happening. It ranges from 0 (impossible event) to 1 (certain event).
- **Formula**:
\[
P(E) = \frac{\text{Number of favorable outcomes}}{\text{Total number of outcomes}}
\]

**Key Terms**:
1. **Experiment**: A process that leads to an outcome.
Example: Tossing a coin, rolling a die.
2. **Sample Space (\(S\))**: The set of all possible outcomes.
Example: For a die roll, \(S = \{1, 2, 3, 4, 5, 6\}\).
3. **Event (\(E\))**: A subset of the sample space representing one or more outcomes.
Example: Getting an even number when rolling a die (\(E = \{2, 4, 6\}\)).
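
The definitions above can be sketched directly in Python; the set names here are illustrative:

```python
from fractions import Fraction

# Sample space for one roll of a fair die, and the event "even number".
sample_space = {1, 2, 3, 4, 5, 6}
event = {x for x in sample_space if x % 2 == 0}  # {2, 4, 6}

# Classical probability: favourable outcomes / total outcomes.
p_even = Fraction(len(event), len(sample_space))
print(p_even)  # 1/2
```

Using `Fraction` keeps probabilities exact instead of introducing floating-point rounding.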

---

### **2. Approaches to Probability**

1. **Classical Approach**:
- Based on equally likely outcomes.
- Formula:
\[
P(E) = \frac{\text{Favorable outcomes}}{\text{Total outcomes}}
\]
Example: Probability of getting a "head" in a coin toss = \( \frac{1}{2} \).

2. **Empirical (Relative Frequency) Approach**:
- Based on observations from experiments or past data.
- Formula:
\[
P(E) = \frac{\text{Number of times event occurs}}{\text{Total number of trials}}
\]
Example: If it rains on 30 days out of 100, \(P(\text{Rain}) = \frac{30}{100} = 0.3\).

3. **Subjective Approach**:
- Based on personal judgment or belief.
Example: The probability of a team winning a match based on expert opinion.
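
The classical and empirical approaches can be contrasted in a short sketch, reusing the rain figures from the text:

```python
from fractions import Fraction

# Classical approach: equally likely outcomes (a fair coin toss).
p_head = Fraction(1, 2)

# Empirical approach: relative frequency from observed data
# (30 rainy days out of 100 observed days, as in the example above).
total_days = 100
rain_days = 30
p_rain = rain_days / total_days

print(p_head, p_rain)  # 1/2 0.3
```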

---

### **3. Calculation of Event Probabilities**

#### **Simple Events**:

- The probability of a single event occurring is calculated using the classical formula.
Example: Probability of rolling a 4 on a die = \( \frac{1}{6} \).

#### **Compound Events**:

- Probability involving two or more events.

1. **Mutually Exclusive Events**: Events that cannot happen simultaneously.
Example: Rolling a 2 and rolling a 3 on a single die are mutually exclusive.
\[
P(A \cap B) = 0
\]

2. **Independent Events**: Events where the occurrence of one does not affect the other.
Example: Tossing two coins; the result of the first toss does not affect the second.

---

### **4. Laws of Probability**

#### **i. Addition Law**:

- For **mutually exclusive events**:
\[
P(A \cup B) = P(A) + P(B)
\]
Example: Probability of rolling a 1 or 2 on a die:
\[
P(1 \cup 2) = P(1) + P(2) = \frac{1}{6} + \frac{1}{6} = \frac{2}{6} = \frac{1}{3}
\]

- For **non-mutually exclusive events**:
\[
P(A \cup B) = P(A) + P(B) - P(A \cap B)
\]
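
A quick check of the general addition law on a die roll, with illustrative events A = "even" and B = "greater than 3":

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even
B = {4, 5, 6}   # greater than 3

def prob(E):
    # Classical probability over the sample space S.
    return Fraction(len(E), len(S))

# General addition law: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = prob(A | B)
rhs = prob(A) + prob(B) - prob(A & B)
print(lhs, rhs)  # 2/3 2/3
```

The subtracted term \(P(A \cap B)\) removes the double-counted outcomes {4, 6}.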

#### **ii. Multiplication Law**:

- For **independent events**:
\[
P(A \cap B) = P(A) \cdot P(B)
\]
Example: Probability of tossing two heads in two coin flips:
\[
P(H \cap H) = P(H) \cdot P(H) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}
\]

- For **dependent events**:
\[
P(A \cap B) = P(A) \cdot P(B|A)
\]
Where \(P(B|A)\) is the conditional probability of \(B\) given \(A\).
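
The dependent-events form can be illustrated with a standard deck (drawing two aces without replacement, an example not in the text):

```python
from fractions import Fraction

# P(A): first card is an ace.
p_first_ace = Fraction(4, 52)
# P(B|A): second card is an ace, given one ace is already gone.
p_second_ace_given_first = Fraction(3, 51)

# Multiplication law for dependent events: P(A ∩ B) = P(A) · P(B|A)
p_both_aces = p_first_ace * p_second_ace_given_first
print(p_both_aces)  # 1/221
```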

---

### **5. Conditional Probability**

- **Definition**: The probability of an event \(B\) occurring, given that event \(A\) has already
occurred.
- **Formula**:
\[
P(B|A) = \frac{P(A \cap B)}{P(A)}, \quad P(A) \neq 0
\]
Example: Probability of drawing a red card from a deck given it’s a face card.
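
The face-card example can be checked numerically with the formula above (card counts for a standard 52-card deck):

```python
from fractions import Fraction

# P(A): the card is a face card (J, Q, K in each of four suits).
p_face = Fraction(12, 52)
# P(A ∩ B): the card is a red face card (hearts or diamonds).
p_red_and_face = Fraction(6, 52)

# Conditional probability: P(B|A) = P(A ∩ B) / P(A)
p_red_given_face = p_red_and_face / p_face
print(p_red_given_face)  # 1/2
```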

---

### **6. Bayes’ Theorem**

- **Definition**: Bayes’ theorem relates the probability of an event based on prior knowledge
of conditions related to the event.
- **Formula**:
\[
P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}
\]

**Applications**:
1. Medical diagnosis (probability of disease given symptoms).
2. Spam detection (probability of an email being spam given certain keywords).
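
A worked medical-diagnosis example of Bayes' theorem, with assumed illustrative figures (1% prevalence, 95% sensitivity, 5% false-positive rate):

```python
# Assumed figures for illustration only.
p_disease = 0.01               # P(A): prior probability of disease
p_pos_given_disease = 0.95     # P(B|A): test sensitivity
p_pos_given_healthy = 0.05     # false-positive rate

# Total probability of a positive test, P(B).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A) · P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ≈ 0.161
```

Even with an accurate test, a low prior makes a positive result far less conclusive than the sensitivity alone suggests.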

---

### **Key Points**

| **Concept** | **Formula/Description** |
|------------------------------|-----------------------------------------------------|
| Classical Probability | \(P(E) = \frac{\text{Favorable outcomes}}{\text{Total outcomes}}\) |
| Empirical Probability | \(P(E) = \frac{\text{Observed favorable cases}}{\text{Total cases}}\) |
| Addition Law (Exclusive) | \(P(A \cup B) = P(A) + P(B)\) |
| Addition Law (Non-Exclusive) | \(P(A \cup B) = P(A) + P(B) - P(A \cap B)\) |
| Multiplication Law | \(P(A \cap B) = P(A) \cdot P(B \mid A)\) |
| Conditional Probability | \(P(B \mid A) = \frac{P(A \cap B)}{P(A)}\) |
| Bayes’ Theorem | \(P(A \mid B) = \frac{P(B \mid A) \cdot P(A)}{P(B)}\) |

---

### **Applications of Probability**


1. **Business**: Risk analysis, decision-making, and forecasting.
2. **Healthcare**: Disease prediction and treatment success rates.
3. **Finance**: Stock market trends and insurance premium calculations.
4. **Artificial Intelligence**: Bayesian networks and machine learning.


### **Probability Distributions**

A **probability distribution** describes how the probabilities of the different outcomes of a random variable are distributed. Distributions are classified as **discrete** (e.g., binomial, Poisson) or **continuous** (e.g., normal).

---

### **1. Binomial Distribution**

#### **1.1 Probability Distribution Function**


The binomial distribution models the number of successes in \(n\) independent trials, each
with a success probability \(p\).

\[
P(X = r) = \binom{n}{r} p^r (1-p)^{n-r}, \quad r = 0, 1, 2, \dots, n
\]
Where:
- \(n\): Number of trials
- \(p\): Probability of success in a single trial
- \(r\): Number of successes
- \(\binom{n}{r}\): Combination, given by \(\frac{n!}{r!(n-r)!}\)

#### **1.2 Constants of Binomial Distribution**


1. **Mean (\(\mu\))**: \( \mu = np \)
2. **Variance (\(\sigma^2\))**: \( \sigma^2 = np(1-p) \)
3. **Standard Deviation (\(\sigma\))**: \( \sigma = \sqrt{np(1-p)} \)
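
The formula and constants above can be computed with the standard library; `n = 10, p = 0.5` are illustrative values:

```python
from math import comb, sqrt

def binom_pmf(r, n, p):
    """P(X = r) = C(n, r) · p^r · (1-p)^(n-r)."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

n, p = 10, 0.5
print(binom_pmf(5, n, p))  # 0.24609375

mean = n * p                     # 5.0
variance = n * p * (1 - p)       # 2.5
sd = sqrt(variance)              # ≈ 1.58
print(mean, variance, sd)
```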

#### **1.3 Shape of Binomial Distribution**


- **Symmetrical**: When \(p = 0.5\).
- **Skewed**: If \(p \neq 0.5\):
- Right skewed when \(p < 0.5\).
- Left skewed when \(p > 0.5\).

#### **1.4 Fitting of Binomial Distribution**
To fit a binomial distribution:
1. Determine \(n\) and \(p\) from the data.
2. Compute the expected frequencies using the probability formula:
\[
f_r = N \cdot P(X = r)
\]
Where \(N\) is the total observed frequency (note: distinct from \(n\), the number of trials).
3. Compare observed and expected frequencies.
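
The fitting steps can be sketched as follows, with assumed values `N = 160` observations of a Binomial(n = 4, p = 0.5) variable:

```python
from math import comb

# Illustrative values: N observations, each a Binomial(n, p) count.
N, n, p = 160, 4, 0.5

# Expected frequency for each r: f_r = N · P(X = r).
expected = [N * comb(n, r) * p**r * (1 - p)**(n - r) for r in range(n + 1)]
print(expected)  # [10.0, 40.0, 60.0, 40.0, 10.0]
```

The expected frequencies sum to \(N\), and would then be compared against the observed frequencies (e.g., with a chi-square test).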

---

### **2. Poisson Distribution**

#### **2.1 Probability Distribution Function**


The Poisson distribution models the probability of a number of events occurring in a fixed
interval, where events occur independently and with a constant average rate (\(\lambda\)).

\[
P(X = r) = \frac{\lambda^r e^{-\lambda}}{r!}, \quad r = 0, 1, 2, \dots
\]
Where:
- \(r\): Number of events
- \(\lambda\): Average number of events per interval (mean)

#### **2.2 Constants of Poisson Distribution**


1. **Mean (\(\mu\))**: \( \mu = \lambda \)
2. **Variance (\(\sigma^2\))**: \( \sigma^2 = \lambda \)
3. **Standard Deviation (\(\sigma\))**: \( \sigma = \sqrt{\lambda} \)
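
A minimal sketch of the Poisson formula, using an assumed rate of \(\lambda = 2\) events per interval:

```python
from math import exp, factorial, sqrt

def poisson_pmf(r, lam):
    """P(X = r) = lam^r · e^(-lam) / r!"""
    return lam**r * exp(-lam) / factorial(r)

lam = 2.0
print(poisson_pmf(0, lam))  # e^-2 ≈ 0.1353

# For the Poisson distribution, mean and variance both equal lam.
mean, variance, sd = lam, lam, sqrt(lam)
```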

#### **2.3 Poisson Approximation to Binomial Distribution**


The Poisson distribution approximates the binomial distribution when:
- \(n\) is large
- \(p\) is small
- \(\lambda = np\)

In such cases:
\[
P(X = r) = \frac{\lambda^r e^{-\lambda}}{r!}
\]
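
The quality of the approximation is easy to verify numerically; `n = 1000, p = 0.002` are illustrative values satisfying the conditions above:

```python
from math import comb, exp, factorial

n, p = 1000, 0.002   # large n, small p
lam = n * p          # 2.0
r = 3

# Exact binomial probability vs. Poisson approximation.
binom = comb(n, r) * p**r * (1 - p)**(n - r)
poisson = lam**r * exp(-lam) / factorial(r)
print(binom, poisson)  # both ≈ 0.18
```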

#### **2.4 Fitting of Poisson Distribution**


1. Compute the mean (\(\lambda\)) from observed data.
2. Use the Poisson formula to calculate probabilities for each \(r\).
3. Compute expected frequencies using:
\[
f_r = N \cdot P(X = r)
\]
Where \(N\) is the total observed frequency.
4. Compare observed and expected frequencies.

---

### **3. Normal Distribution**

#### **3.1 Probability Distribution Function**


The normal distribution is a continuous probability distribution described by its mean (\(\mu\))
and standard deviation (\(\sigma\)).

\[
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}
\]
Where:
- \(x\): Random variable
- \(\mu\): Mean
- \(\sigma^2\): Variance

#### **3.2 Properties of the Normal Curve**


1. **Shape**: Bell-shaped and symmetric about the mean (\(\mu\)).
2. **Mean, Median, Mode**: All are equal and located at the center.
3. **Spread**: Determined by the standard deviation (\(\sigma\)).
4. **Total Area Under Curve**: Equals 1.
5. **Empirical Rule**:
- 68% of data lies within \(\mu \pm \sigma\).
- 95% within \(\mu \pm 2\sigma\).
- 99.7% within \(\mu \pm 3\sigma\).

#### **3.3 Calculation of Probabilities**


To calculate probabilities:
1. Standardize \(x\) to a **z-score**:
\[
z = \frac{x - \mu}{\sigma}
\]
2. Use the standard normal distribution table to find probabilities.
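
The two steps can be carried out without a printed table, since the standard normal CDF can be written with the error function; \(\mu = 100, \sigma = 15\) are illustrative values:

```python
from math import erf, sqrt

def std_normal_cdf(z):
    """Φ(z): area under the standard normal curve to the left of z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 100, 15   # assumed mean and standard deviation
x = 130

# Step 1: standardize to a z-score.
z = (x - mu) / sigma  # 2.0
# Step 2: look up (compute) the cumulative probability.
print(round(std_normal_cdf(z), 4))  # P(X <= 130) ≈ 0.9772
```

The result matches the empirical rule: about 97.7% of the distribution lies below \(\mu + 2\sigma\).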

---

### **Comparison of Distributions**

| Feature | Binomial Distribution | Poisson Distribution | Normal Distribution |
|-----------------------|-----------------------------|-------------------------------|------------------------------|
| **Type** | Discrete | Discrete | Continuous |
| **Parameters** | \(n\), \(p\) | \(\lambda\) | \(\mu\), \(\sigma\) |
| **Shape** | Symmetric or skewed | Right-skewed | Symmetric (bell-shaped) |
| **Mean** | \(np\) | \(\lambda\) | \(\mu\) |
| **Variance** | \(np(1-p)\) | \(\lambda\) | \(\sigma^2\) |
| **Applications** | Pass/fail, surveys | Rare events (e.g., defects) | Heights, test scores |

---

### **Applications**
1. **Binomial**: Quality control, survey results, pass/fail scenarios.
2. **Poisson**: Defects in manufacturing, number of calls in a call center, rare events.
3. **Normal**: Heights, weights, exam scores, natural phenomena.

