Statistics for Economics (Statistikskript VWL, Final E v2 Slides)

Contents
Part I: Probability
1. Probability theory: the building blocks
1.1. Events and the sample space
1.2. Relations of set theory
1.3. The concept of probability
1.4. Axiomatic definition of probability
1.5. Basic theorems
1.6. Probability spaces
1.7. Conditional probability and stochastic independence
1.8. Law of total probability
1.9. Bayes’ theorem
2. Combinatorial methods
2.1. Factorials and binomial coefficients
2.2. Multiplication rule
2.3. Permutations
2.4. Combinations
2.5. Sampling with replacement
3. Random variables
3.1. The (cumulative) distribution function
3.2. Discrete random variables
3.3. Continuous random variables
3.4. The expectation of a random variable
3.5. Variance
3.6. Standardization
4. Special distributions
4.1. The uniform discrete distribution
4.2. The Bernoulli distribution (discrete)
4.3. The Binomial distribution (discrete)
4.4. The Poisson distribution (discrete)
4.5. The uniform continuous distribution
4.6. The exponential distribution (continuous)
4.7. The normal distribution (continuous)
5. Multivariate random variables
5.1. Joint distribution and marginal distributions
5.2. Conditional distributions and stochastic independence
5.3. Covariance and correlation
5.4. Sums and sample means of random variables
6. The Central Limit Theorem

Part II: Statistics
7. Descriptive statistics
7.1. Frequency tables, histograms, and empirical distributions
7.2. Summarizing data using numerical techniques
7.3. Boxplot
7.4. Quantile-Quantile plot
7.5. Scatter diagram
8. Estimation of unknown parameters
8.1. Intuitive examples of estimators
8.2. Properties of estimators
8.3. Main methods to get estimators
9. Confidence intervals
9.1. The idea
9.2. Example of a confidence interval (mean of a distribution, large samples)
9.3. Relation with testing hypotheses
Part I: Probability
1. Probability theory: the building blocks
1.1. Events and the sample space
Definition
A random experiment is a process whose outcome cannot be predicted with certainty in advance.

Definition
The set S of all possible outcomes of an experiment is called the sample space of the experiment.
Example 1.1.1:
a) When a six-sided die is rolled:
S = {1, 2, 3, 4, 5, 6}
b) If we flip a coin twice:
S = {HH, HT, TH, TT}
c) If we flip a coin until we get a head:
S = {H, TH, TTH, TTTH, …}
(countably many elements)
d) For the lifespan of a light bulb we have:
S = {t : t ≥ 0} = [0, ∞) = ℝ₊
e) If the outcome is an entire (non-negative) path over time:
S = {all functions f : ℝ₊ → ℝ₊}
Remark:
The appropriate sample space depends on which aspects of the experiment we are interested in.
Example 1.1.2:
Two coins are tossed:
→ if we are interested in the outcomes heads or tails of the two coins:
S₁ = {(H,H), (H,T), (T,H), (T,T)}
→ if, instead, we count the number of heads/tails:
S₂ = {(2,0), (1,1), (0,2)}
→ finally, if we only want to see whether they show the same (s) or a different (d) result:
S₃ = {s, d}.
Definition
An event A is a well-defined, arbitrary set of possible outcomes of the experiment. It is a subset of the sample space S.

We say that an event A occurred if the outcome of the experiment is an element contained in A.
Definition
The set of all events of the experiment under consideration is denoted by E (the event space).

We assume that two specific events must always be contained in E:
the certain event S and the impossible event Ø.
1.2. Relations of set theory
Definition
Complement: Ā is the set that contains all elements of S that do not belong to A.
(Venn diagram: Ā is the region of S outside A.)
Definition
Union: A ∪ B (‘‘A or B’’) is defined to be the set containing all outcomes that belong to A alone, to B alone, or to both A and B.
Definition
Intersection: A ∩ B (‘‘A and B’’) is defined to be the set that contains all outcomes that belong both to A and to B.
Definition
Disjoint events: A and B are called disjoint (mutually exclusive) if they have no outcomes in common, i.e., A ∩ B = Ø.
Definition
Difference: A\B is the set of all outcomes that belong to A but not to B.
Definition
Containment: C is contained in A (C ⊂ A) if every outcome of C also belongs to A.
(Venn diagram: C drawn inside A within S.)
1.3. The concept of probability
P : E → ℝ, A ↦ P(A)
is a real-valued function: to each event in E is assigned exactly one element of ℝ.
1) The subjective interpretation of probability:
probability as a person’s degree of belief that an event will occur.
2) The frequency interpretation of probability:
probability as the limiting relative frequency of an event in a long series of identical repetitions of the experiment.
Definition
If an experiment is repeated n times and the event A occurs n(A) of those times, the relative frequency of A is hₙ(A) = n(A)/n. For large n, hₙ(A) stabilizes around the probability P(A).
Example 1.3.1:
A die is rolled 3,000 times in succession. A running tally is kept of the number of times we get a "6".
What do we expect for P("6")?
Illustration of Example 1.3.1:
(Plot of the running relative frequency of "6", which settles near 1/6.)
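The stabilization of the relative frequency can be illustrated with a short simulation; a minimal Python sketch (the sample size 3,000 matches the example; the seed and checkpoints are only illustrative):

```python
import random

random.seed(1)                     # illustrative seed, for reproducibility
n = 3000                           # number of rolls, as in Example 1.3.1
sixes = 0
for i in range(1, n + 1):
    if random.randint(1, 6) == 6:  # one fair-die roll
        sixes += 1
    if i in (10, 100, 1000, 3000):
        print(f"after {i:4d} rolls: relative frequency = {sixes / i:.4f}")
# for growing n the relative frequency stabilizes near 1/6 = 0.1667
```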
3) The classical interpretation of probability
The classical interpretation of probability is based on the concept of equally likely outcomes.
If all n possible outcomes are equally likely:
P(A) = (# outcomes belonging to A)/(# possible outcomes) = |A|/n.
Definition
A random experiment with finitely many, equally likely outcomes is called a Laplace experiment.
Example 1.3.2:
We toss a die and a coin simultaneously. We want to compute the probability of the event
A = "heads and number larger than 4".
→ The elementary events
(H,1), (T,1), (H,2), (T,2), …, (H,6), (T,6)
are equally likely!
⇒ A = {(H,5), (H,6)} and P(A) = 2/12 = 1/6.
Example 1.3.3:
When flipping a coin twice:
1.4. Axiomatic definition of probability
Definition
A function
P : E → ℝ, A ↦ P(A),
that assigns to each event A in E a real number is called a probability function (or measure).
P(A) is called the probability of the event A when the following axioms hold (Kolmogorov, 1933):

Axiom 1: P(A) ≥ 0 for every event A ∈ E.
Axiom 2: P(S) = 1.
Axiom 3: For every sequence of pairwise disjoint events A₁, A₂, … ∈ E:
P(A₁ ∪ A₂ ∪ …) = P(A₁) + P(A₂) + … (σ-additivity).
1.5. Basic theorems
Theorem 1
The probability of the complement of an event A is
given by
P(Ā) = 1 − P(A), for each event A ∈ E
Theorem 2
The probability of the impossible event is given by:
P(Ø) = 0
Theorem 3
For every finite sequence of n pairwise disjoint events A₁, A₂, …, Aₙ ∈ E, the probability of the union of the events equals the sum of the individual probabilities. That is,
P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = Σ (i=1..n) P(Aᵢ).
Theorem 4
For an event resulting from a difference A\B we have
that P(A\B) = P(A) – P(A∩B).
Theorem 5 (addition rule)
For every two events A and B in E we have that
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Theorem 6 (monotonicity property)
If an event A is contained in an event B, then its probability can never be larger than that of B, that is,
A ⊂ B ⇒ P(A) ≤ P(B).
Example 1.5.1:
A three-digit number (000–999) is chosen at random:
S = {000, …, 999}, |S| = # outcomes in S = 10³ = 1000.
We consider the event
A = "at least two of the three digits coincide".
Then (Theorem 1):
P(A) = 1 − P(Ā), where
|Ā| = # three-digit numbers with all different digits = 10 · 9 · 8 = 720
⇒ P(A) = 1 − 720/1000 = 0.28.
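The counting argument can be cross-checked by brute-force enumeration; a minimal Python sketch (the event definition follows the example above):

```python
# count numbers 000..999 in which at least two digits coincide
hits = 0
for k in range(1000):
    digits = f"{k:03d}"            # three digits, leading zeros included
    if len(set(digits)) < 3:       # some digit appears at least twice
        hits += 1
print(hits, hits / 1000)           # 280 and 0.28 = 1 - 720/1000
```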
1.6. Probability spaces
1. Probability space with finitely many outcomes:
Let us now consider a probability function P(·) that satisfies the axioms. That is, each elementary outcome eᵢ of S is assigned a probability pᵢ = P({eᵢ}):

e₁  e₂  e₃  …  eᵢ  …
p₁  p₂  p₃  …  pᵢ  …

(1) pᵢ ≥ 0 for each i = 1, 2, …
(2) Σ (all i) pᵢ = 1, because eᵢ ∩ eⱼ = Ø ∀ i ≠ j (the eᵢ are pairwise disjoint) and ∪ (all i) eᵢ = S, so that by Theorem 3:
Σ (all i) pᵢ = P(∪ (all i) eᵢ) = P(S) = 1.
(3) P(A) = Σ (eᵢ ∈ A) pᵢ.

Special case (Laplace experiment): e₁, …, e_m with pᵢ = 1/m ∀ i = 1, …, m
⇒ Σ (i=1..m) pᵢ = m · (1/m) = 1.
2. Probability space with countably infinitely many outcomes:
Experiment: "Flip a coin until we get a head".
p₁ = P(H) = 1/2, p₂ = P(TH) = 1/4, p₃ = P(TTH) = 1/8, …
In general, pᵢ = P(TT…TH) = 1/2ⁱ (i − 1 tails followed by one head)
⇒ Σ (all i) pᵢ = Σ (i=1..∞) 1/2ⁱ = 1 (geometric series).
(Illustration: a square of area 1 split into pieces of area 1/2 = p₁, 1/4 = p₂, 1/8, 1/16, …)
General probability spaces

Definition (continuous sample space)
If a sample space S is uncountable, it is called continuous.
Example 1.6.2:
Let us now conduct an experiment constructed as follows: choose randomly an arbitrary real number a, 0 ≤ a ≤ 1, in the interval [0, 1].
In any case we have that
P({a}) = 0 ∀ a ∈ S = [0, 1].
Consider the events
A = {a | a < 0.4}
B = {a | 0.6 < a < 0.9}
C = {a | a > 0.8}
Intuition:
P(A) = 0.4, P(B) = 0.3 and P(C) = 0.2. Why?
→ S can be seen as a continuous Laplace sample space, where all real numbers are equally likely to be chosen.
Thus, for an event A: P(A) = length(A)/length(S);
P(B ∩ C) = length(B ∩ C)/length(S) = 0.1/1 = 0.1 (B ∩ C = (0.8, 0.9))
By Theorem 5:
P(B ∪ C) = P(B) + P(C) − P(B ∩ C) = 0.3 + 0.2 − 0.1 = 0.4
Remark:
The intervals (events) can be open or closed. For example, P((a, b)) = P([a, b]) = b − a, since every single point has probability 0.
2. Define S = {(a, b) | 0 < a < 3 and 0 < b < 2},
D = {(a, b) | 0 < a < b < 2},
and K a disk contained in S with area π (e.g., radius 1).
Using the classical definition of probability:
P(D) = Area of D / Area of S = 2/6 = 1/3
P(K) = Area of K / Area of S = π/6 ≅ 0.5236
1.7. Conditional probability and stochastic independence
Example 1.7.1:
Rolling a die:
→ P("6") = 1/6.
And when we additionally know that the resulting number is even?
Definition (conditional probability)
Let A and B be events with P(B) > 0. The conditional probability of A given B is
P(A | B) = P(A ∩ B)/P(B).
Example 1.7.2:
Two dice are rolled:
(1,1) (1,2) …… …… …… (1,6)
(2,1) (2,2) …… …… …… (2,6)
(3,1) (3,2) …… …… …… (3,6)
(4,1) (4,2) …… …… (4,5) (4,6)
(5,1) (5,2) …… …… (5,5) (5,6)
(6,1) (6,2) …… (6,4) (6,5) (6,6)

A = "at least one 6": P(A) = 11/36
B = "sum > 9": P(B) = 6/36 and P(A ∩ B) = 5/36
⇒ P(A | B) = P(A ∩ B)/P(B) = (5/36)/(6/36) = 5/6
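Because the 36 pairs are equally likely, the conditional probability can be verified by exact enumeration; a minimal Python sketch:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # the 36 equally likely pairs
A = [o for o in outcomes if 6 in o]               # "at least one 6"
B = [o for o in outcomes if sum(o) > 9]           # "sum > 9"
AB = [o for o in B if 6 in o]                     # A and B together
print(len(A) / 36)                                # 11/36
print(len(B) / 36)                                # 6/36
print(len(AB) / len(B))                           # P(A | B) = 5/6
```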
Theorem 7 (multiplication rule)
Let A and B be events with positive probability. The probability of the intersection of A and B is given by
P(A ∩ B) = P(A | B) · P(B) = P(B | A) · P(A).
Example 1.7.3:
An urn contains 4 balls: 3 red and 1 blue. Two balls are drawn without replacement. Let Rᵢ = "i-th ball drawn is red". Then
P(R₁ ∩ R₂) = P(R₁) · P(R₂ | R₁) = (3/4) · (2/3) = 1/2.
Definition (stochastically independent events)
Two events A and B are called stochastically independent if
P(A | B) = P(A), or equivalently
P(B | A) = P(B).
Theorem 8 (multiplication rule for independent events)
If A and B are independent, then
P(A ∩ B) = P(A) · P(B).
Example 1.7.4:
Example 1.7.5:
Remark:
Stochastic independence is not a transitive relation! From "A and B independent" and "B and C independent" it does not necessarily follow that "A and C are independent"!
1.8. Law of total probability
Example 1.8.1:
A = "defective article"; Mᵢ = "produced by machine Mᵢ", i = 1, 2.
(Venn diagram: S split into M₁ and M₂; A = (A ∩ M₁) ∪ (A ∩ M₂).)
We have (multiplication rule):
P(A ∩ Mᵢ) = P(A | Mᵢ) · P(Mᵢ), i = 1, 2
P(A) = P(A ∩ M₁) + P(A ∩ M₂)
= P(A | M₁) · P(M₁) + P(A | M₂) · P(M₂)
= 0.1 · (2/3) + 0.07 · (1/3) = 0.09.
Definition (partition)
The events H₁, H₂, …, Hₙ form a partition of S if they are pairwise disjoint (Hᵢ ∩ Hⱼ = Ø for i ≠ j) and H₁ ∪ H₂ ∪ … ∪ Hₙ = S.
Theorem 9 (law of total probability)
Let H₁, H₂, …, Hₙ be a partition of S with P(Hᵢ) > 0 for all i. Then for every event A:
P(A) = Σ (i=1..n) P(A | Hᵢ) · P(Hᵢ).
(Venn diagram: S partitioned into H₁, …, Hₙ, with A intersecting several of the Hᵢ.)
Example 1.8.2:
In Example 1.7.3, an urn with 4 balls (3 red and 1 blue):
R₁ = "first ball drawn is red"
and
R̄₁ = "first ball drawn is blue"
form a partition of S.
We can therefore compute the probability of the event
R₂ = "second ball drawn is red" as
P(R₂) = P(R₂ | R₁) · P(R₁) + P(R₂ | R̄₁) · P(R̄₁)
= (2/3) · (3/4) + (3/3) · (1/4) = 3/4.
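The law of total probability for this urn can be checked by enumerating all ordered draws; a minimal Python sketch:

```python
from itertools import permutations

balls = ["r", "r", "r", "b"]              # 3 red balls, 1 blue ball
draws = list(permutations(balls, 2))      # 12 ordered draws without replacement
p_r2 = sum(1 for d in draws if d[1] == "r") / len(draws)
print(p_r2)                               # 0.75 = 3/4, as computed above
```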
Remark:
Computations like those appearing in the law of total probability can often be visualized using a tree diagram.
(Tree diagram: first branch R₁ / R̄₁, then R₂ / R̄₂ on each branch:
red-red, red-blue, blue-red, blue-blue (impossible).)
1.9. Bayes’ theorem
Bayes’ theorem:
For a partition H₁, …, Hₙ of S and an event B with P(B) > 0:
P(Hᵢ | B) = P(B | Hᵢ) · P(Hᵢ) / Σ (j=1..n) P(B | Hⱼ) · P(Hⱼ).
Example 1.9.1: (identifying defective items)
In Example 1.8.1 the probability that an article randomly chosen from the total production was produced by machine M₁ was (a priori)
P("article produced by machine M₁") = 2/3 ≈ 0.667.
If we now observe that the chosen article is defective, we should increase that probability, given that machine M₁ produces a larger share of defective items.
Example 1.9.1 (continued):
From Bayes’ theorem we get:
P("article from M₁" | "article defective") = (0.1 · 2/3)/(0.1 · 2/3 + 0.07 · 1/3) = 20/27 ≈ 0.741.
Example: (reliability of a medical test)
Let H₁ = "person has the condition" and B = "test T gives a positive response".
→ prior probabilities: P(H₁) = 0.001 and P(H̄₁) = 0.999.
Law of total probability:
⇒ P(B) = P(B | H₁) · P(H₁) + P(B | H̄₁) · P(H̄₁)
= 0.9 · 0.001 + 0.01 · 0.999 = 0.01089, and
Bayes’ theorem:
⇒ P(H₁ | B) = P(B | H₁) · P(H₁)/P(B) = (0.9 · 0.001)/0.01089 = 0.0826,
that is, the reliability of a positive response is about 8.3%.
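Both Bayes computations follow the same pattern, which can be captured in a small helper; a minimal Python sketch (the function name is ours, not from the slides):

```python
def posterior(prior, p_b_given_h, p_b_given_not_h):
    """P(H1 | B) via Bayes' theorem; the denominator is the law of total probability."""
    p_b = p_b_given_h * prior + p_b_given_not_h * (1 - prior)
    return p_b_given_h * prior / p_b

print(posterior(2 / 3, 0.10, 0.07))   # defective-article example: 0.7407 = 20/27
print(posterior(0.001, 0.90, 0.01))   # medical-test example: 0.0826
```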
Tree diagram:
(Condition first: H₁ / H̄₁; then Test T on each branch: B / B̄.)
2. Combinatorial methods
2.1. Factorials and binomial coefficients
Definition
For a natural number n, n! = n · (n − 1) · … · 2 · 1 ("n factorial"); by convention, 0! = 1.
Definition
The binomial coefficient ("n choose k") is defined as
C(n, k) = n!/(k! · (n − k)!), 0 ≤ k ≤ n.
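Factorials and binomial coefficients are available directly in Python’s standard library; a minimal sketch:

```python
import math

print(math.factorial(5))   # 5! = 120
print(math.comb(6, 2))     # binomial coefficient 6!/(2! * 4!) = 15
# math.comb(n, k) computes n!/(k!*(n-k)!) exactly:
print(math.comb(6, 2) == math.factorial(6) // (math.factorial(2) * math.factorial(4)))
```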
2.2. Multiplication rule
Multiplication rule:
If an experiment consists of k parts with n₁ possible outcomes for the first part, n₂ for the second, …, and n_k for the k-th, then the total number of possible outcomes is
n₁ · n₂ · n₃ · … · n_k.
2.3. Permutations
The number of different orderings (permutations) of n distinct objects is n!.
2.4. Combinations

Definition
The number of ways to choose k objects out of n distinct objects, without regard to order, is the binomial coefficient C(n, k) = n!/(k! · (n − k)!).
2.5. Sampling with replacement
Definition
If k objects are drawn from n distinct objects with replacement (and with regard to order), the number of possible samples is n^k.
3. Random variables
Definition
A random variable X is a real-valued function defined on the sample space: to each outcome e ∈ S it assigns a real number X(e) ∈ ℝ.
(Illustration: S mapped to points of the real line, e.g., −1, 0, 1, 2, 3.)
Example 3.0.1:
If we roll one die once, we have that
S = {1, 2,…,6}.
X(e) = e.
Example 3.0.2:
A coin is tossed once. Let X denote the number of heads. X has only two possible values:
X("tail") = 0 and X("head") = 1.
The set of events E includes the four events:
E = {Ø, {"tail"}, {"head"}, S}.
A further example: if we record which of three objects X, Y, Z occur,
S = {∅, {X}, {Y}, {Z}, {X,Y}, {X,Z}, {Y,Z}, {X,Y,Z}},
and counting the occurring objects gives the image set
→ W = {0, 1, 2, 3}.
Example 3.0.5: (revenue under uncertain conditions)
Three orders A, B, C may be placed, independently of one another:
Order A: revenue 10, placed with probability 0.8
Order B: revenue 14, placed with probability 0.5
Order C: revenue 24, placed with probability 0.75
Image of X = "total revenue"?

Order positions (eᵢ) | Revenue X(eᵢ) | P({eᵢ})
− | 0 | 0.025
A | 10 | 0.1
B | 14 | 0.025
C | 24 | 0.075
AB | 24 | 0.1
AC | 34 | 0.3
BC | 38 | 0.075
ABC | 48 | 0.3
(Σ = 1)
3.1. The (cumulative) distribution function
Definition
The (cumulative) distribution function of a random variable X is
F(x) = P(X ≤ x), x ∈ ℝ.
Example 3.1.1:
X = "number of heads when tossing a coin once".
F(x) = 0, if x < 0
F(x) = 1/2, if 0 ≤ x < 1
F(x) = 1, if x ≥ 1
(Step-function plot with jumps of 1/2 at x = 0 and x = 1.)
Example 3.1.2:
Two dice are rolled once. Let Y be "the absolute difference between the two numbers". Then:
Y = 0, if (i, j) = (1,1); (2,2); (3,3); …; (6,6): 6 pairs
Y = 1, if (i, j) = (1,2); (2,1); …: 10 pairs
Y = 2, if (i, j) = (1,3); (3,1); (2,4); …: 8 pairs
Y = 3, if (i, j) = (1,4); (4,1); …: 6 pairs
Y = 4, if (i, j) = (1,5); (5,1); (2,6); (6,2): 4 pairs
Y = 5, if (i, j) = (1,6); (6,1): 2 pairs
In total: 36 pairs.
Example 3.1.2 (continued):
Thus: P(Y = 0) = 6/36 = 1/6;  P(Y = 1) = 10/36 = 5/18;  P(Y = 2) = 8/36 = 2/9;
P(Y = 3) = 6/36 = 1/6;  P(Y = 4) = 4/36 = 1/9;  P(Y = 5) = 2/36 = 1/18.

⇒ F(y) = 0, y < 0
F(y) = 1/6, 0 ≤ y < 1
F(y) = 4/9, 1 ≤ y < 2
F(y) = 2/3, 2 ≤ y < 3
F(y) = 5/6, 3 ≤ y < 4
F(y) = 17/18, 4 ≤ y < 5
F(y) = 1, y ≥ 5

(Step-function plot of F(y) with jumps at y = 0, 1, 2, 3, 4, 5.)
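The probability and distribution functions of Y can be rebuilt exactly by enumerating the 36 pairs; a minimal Python sketch using exact fractions:

```python
from fractions import Fraction
from itertools import product

pmf = {}
for i, j in product(range(1, 7), repeat=2):
    y = abs(i - j)                                   # absolute difference
    pmf[y] = pmf.get(y, Fraction(0)) + Fraction(1, 36)

cdf = Fraction(0)
for y in sorted(pmf):
    cdf += pmf[y]
    print(y, pmf[y], cdf)   # reproduces 1/6, 4/9, 2/3, 5/6, 17/18, 1
```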
Properties of the distribution function:
(1) F is non-decreasing: x₁ < x₂ ⇒ F(x₁) ≤ F(x₂);
(2) lim (x → −∞) F(x) = 0 and lim (x → +∞) F(x) = 1;
(3) F is right-continuous.
3.2. Discrete random variables
The probability function f of a discrete random variable X, f(xᵢ) = P(X = xᵢ), satisfies:
(1) f(xᵢ) ≥ 0;
(2) Σᵢ f(xᵢ) = 1;
(3) f(xᵢ) ≤ 1.
Remark:
For real-valued intervals, we can generally compute the probabilities using the following formulas:
P(a < X ≤ b) = F(b) − F(a) = Σ (a < xᵢ ≤ b) f(xᵢ)
P(a < X < b) = F(b) − F(a) − f(b)
P(a ≤ X ≤ b) = F(b) − F(a) + f(a)
P(a ≤ X < b) = F(b) − F(a) + f(a) − f(b)
3.3. Continuous random variables
A random variable X is continuous if there exists a non-negative function f, the density function of X, such that
P(a ≤ X ≤ b) = ∫ (a..b) f(x) dx.
Every density function satisfies the following properties:
(1) f(x) ≥ 0 for all x;
(2) ∫ (−∞..+∞) f(x) dx = 1.
Example 3.3.1:
F(x) = 0, x < 0
F(x) = (1/27) · (x − 3)³ + 1, 0 ≤ x < 3
F(x) = 1, x ≥ 3
Example 3.3.1 (continued):
f(x) = 0, x < 0
f(x) = (1/9) · (x − 3)², 0 ≤ x < 3
f(x) = 0, x ≥ 3
Example 3.3.1 (continued):
(Plots of the density f(x) and the distribution function F(x); the area between 1 and 2 equals F(2) − F(1) = 0.2593.)
What is the probability P(1 ≤ X ≤ 2)?
P(1 ≤ X ≤ 2) = F(2) − F(1) = ((−1)³/27 + 1) − ((−2)³/27 + 1) = −1/27 + 8/27 = 7/27 = 0.2593
or
P(1 ≤ X ≤ 2) = ∫ (1..2) (1/9) · (x − 3)² dx = [(1/27) · (x − 3)³] (1..2) = −1/27 + 8/27 = 7/27.
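The same probability can be approximated numerically from the density; a minimal Python sketch using a midpoint rule (the step count is arbitrary):

```python
def f(x):
    return (x - 3) ** 2 / 9          # density of Example 3.3.1 on [0, 3)

n = 10000                            # number of midpoint-rule subintervals
h = (2 - 1) / n
approx = sum(f(1 + (k + 0.5) * h) for k in range(n)) * h
print(approx)                        # ~0.259259 = 7/27
```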
Example: (waiting time at the station)
Trains arrive every 12 minutes; we arrive at a randomly chosen point in time. Let us define
X = "waiting time at the station"
as the random variable of interest. The image set of X is W = [0, 12] (minutes).

Density function: f(x) = 1/12, if x ∈ [0, 12]; 0, else.

Distribution function:
F(x) = 0, x < 0
F(x) = x/12, 0 ≤ x ≤ 12 (← ∫ (0..x) (1/12) du = x/12)
F(x) = 1, x > 12

Then: P(10 < X < 15) = F(15) − F(10) = 1 − 10/12 = 0.1667
P(X > 9) = 1 − F(9) = 1 − 9/12 = 0.25
3.4. The expectation of a random variable
Definition
Let X be a random variable and f be its probability or density function (discrete or continuous X). The expectation (or mean) of X is
μₓ = E[X] = Σᵢ xᵢ · f(xᵢ) (discrete case),
μₓ = E[X] = ∫ (−∞..+∞) x · f(x) dx (continuous case).
Example 3.4.1:
Example 3.4.2: (rolling two dice)
… = (27 − 1)/12 = 26/12 = 13/6.
The expectation of a function of a random variable (law of the unconscious statistician):
E[g(X)] = Σᵢ g(xᵢ) · f(xᵢ) (discrete case),
E[g(X)] = ∫ (−∞..+∞) g(x) · f(x) dx (continuous case).
Example 3.4.4:
Breakdowns are observed during the activity of a
production center.
X 0 1 2 3
f( x ) 0.35 0.4 0.15 0.1
Example 3.4.4 (continued):
But: The correct way to compute the expected costs is
E[g(X)] = g(0) · 0.35 + g(1) · 0.4 + g(2) · 0.15 + g(3) · 0.1
= 1 · 0.35 + 3 · 0.4 + (11/3) · 0.15 + 4 · 0.1
= 2.5.
Example 3.4.5:
x | 0 | 1 | 2 | 3
fₓ(x) | 0.1 | 0.3 | 0.2 | 0.4

Y = (X − 2)² → f_Y? W_Y = {0, 1, 4}

y | 0 | 1 | 4
f_Y(y) | 0.2 | 0.7 | 0.1

→ E[Y] = 0 · 0.2 + 1 · 0.7 + 4 · 0.1 = 0.7 + 0.4 = 1.1
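The point of the law of the unconscious statistician is that E[g(X)] needs only the distribution of X; a minimal Python sketch comparing both routes for Example 3.4.5:

```python
xs = [0, 1, 2, 3]
fx = [0.1, 0.3, 0.2, 0.4]

# route 1: directly via f_X (law of the unconscious statistician)
e1 = sum((x - 2) ** 2 * p for x, p in zip(xs, fx))

# route 2: first build the distribution of Y = (X - 2)^2, then take its mean
fy = {}
for x, p in zip(xs, fx):
    y = (x - 2) ** 2
    fy[y] = fy.get(y, 0.0) + p
e2 = sum(y * p for y, p in fy.items())

print(e1, e2, fy)   # 1.1 both ways; fy = {4: 0.1, 1: 0.7, 0: 0.2}
```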
Computing expectations: linear function
Theorem
For constants a, b ∈ ℝ: E[aX + b] = a · E[X] + b.
Example 3.4.6:
Let X have density f(x) = e^(−x), x ≥ 0. Then, integrating by parts,
E[X] = ∫ (0..∞) x · e^(−x) dx = [−x · e^(−x)] (0..∞) + ∫ (0..∞) e^(−x) dx = [−e^(−x)] (0..∞) = 1.

Now let Y have density f(y) = (1/2) · e^(−(y−1)/2), y ≥ 1 (i.e., Y = 2X + 1). First,
∫ (1..∞) (1/2) · e^(−(y−1)/2) dy = [−e^(−(y−1)/2)] (1..∞) = 1.
Then, again integrating by parts:
E[Y] = ∫ (1..∞) y · (1/2) · e^(−(y−1)/2) dy = [−y · e^(−(y−1)/2)] (1..∞) + ∫ (1..∞) e^(−(y−1)/2) dy
= 1 + [−2 · e^(−(y−1)/2)] (1..∞) = 1 + 2 = 3
(consistent with the linear-function theorem: E[Y] = 2 · E[X] + 1 = 3).
Example 3.4.7:
Let us consider a random variable X with p.f.
x | −1 | 0 | 1.5 | 2
f(x) | 0.3 | 0.1 | 0.4 | 0.2
⇒ E[X] = (−1) · 0.3 + 0 · 0.1 + 1.5 · 0.4 + 2 · 0.2 = 0.7
E[X + 3] = E[X] + 3 = 3.7
E[4X] = 4 · E[X] = 4 · 0.7 = 2.8
Example 3.4.8: (rolling-die game)
‘‘Fair game’’ means that the fee one has to pay exactly equals the expected gain. Let X denote the gain (in cents).

x | 10 | 20 | 40 | 80
f(x) | 2/6 | 2/6 | 1/6 | 1/6

⇒ E[X] = 10 · (2/6) + 20 · (2/6) + 40 · (1/6) + 80 · (1/6) = 180/6 = 30,
so the fair fee is 30 cents.
3.5. Variance
Definition
Let X be a random variable with finite mean μₓ. The variance of X is defined as follows:
σₓ² = V(X) = E[(X − μₓ)²].
The positive square root σₓ = √V(X) is called the standard deviation of X.
Example 3.5.1: (flipping a coin)
Then:
x | 0 | 1 | 2
f(x) | 1/4 | 1/2 | 1/4
and μₓ = 1
σₓ² = (0 − 1)² · (1/4) + (1 − 1)² · (1/2) + (2 − 1)² · (1/4) = 1/2
Example 3.5.2: (rolling two dice)
Example 3.5.3: (continuous case)
f(x) = c · (x − x²/2), 0 < x < 2; 0, else.

Example 3.5.3 (continued):
c · ∫ (0..2) (x − x²/2) dx = c · [x²/2 − x³/6] (0..2) = c · (2 − 8/6) = c · (2/3) = 1 (required)
⇒ c = 3/2
Example 3.5.3 (continued):
Then:
E[X] = (3/2) · ∫ (0..2) x · (x − x²/2) dx = (3/2) · [x³/3 − x⁴/8] (0..2) = (3/2) · (8/3 − 2) = 1
and
V(X) = (3/2) · ∫ (0..2) (x − 1)² · (x − x²/2) dx = (3/2) · ∫ (0..2) (x − (5/2) · x² + 2x³ − x⁴/2) dx
= (3/2) · (2 − 40/6 + 16/2 − 32/10) = 1/5;  σₓ = √(1/5) = 0.4472.
Computation of variances: simple rules
For constants a, b ∈ ℝ:
V(aX + b) = a² · V(X);
in particular, V(X + b) = V(X) and V(aX) = a² · V(X).
Example 3.5.4:
Let us consider the random variable X with p.f.
x | 6060 | 6100 | 6140
f(x) | 0.2 | 0.3 | 0.5
Define Y = (X − 6100)/40, so that Y takes the values −1, 0, 1 with the same probabilities and X = 6100 + 40 · Y. We then get:
E[Y] = (−1) · 0.2 + 0 · 0.3 + 1 · 0.5 = 0.3
V(Y) = (−1 − 0.3)² · 0.2 + (0 − 0.3)² · 0.3 + (1 − 0.3)² · 0.5 = 0.61
⇒ V(X) = V(6100 + 40 · Y) = 40² · V(Y) = 40² · 0.61 = 976.
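The rule V(aX + b) = a² · V(X) can be verified numerically for this example; a minimal Python sketch:

```python
xs = [6060, 6100, 6140]
ps = [0.2, 0.3, 0.5]

mu = sum(x * p for x, p in zip(xs, ps))
vx = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))     # direct V(X)

ys = [(x - 6100) / 40 for x in xs]                      # Y = (X - 6100)/40
my = sum(y * p for y, p in zip(ys, ps))
vy = sum((y - my) ** 2 * p for y, p in zip(ys, ps))     # V(Y) = 0.61

print(vx, 40 ** 2 * vy)                                 # 976.0 both ways
```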
Theorem (alternative method for computing variances)
For every random variable X: V(X) = E[X²] − μₓ².
For Y = "absolute difference of two dice" (Example 3.1.2):
⇒ V(Y) = 210/36 − (70/36)² = 5.8333 − 3.78086 = 2.05247.
Steiner rule:
Let X be a random variable with E[X] = μ and let d be a real-valued constant. Then
E[(X − d)²] = V(X) + (μ − d)².
3.6. Standardization
Definition
Let X be a random variable with mean μ and standard deviation σ > 0. The standardization of X is
Z = (X − μ)/σ.
Every standardized random variable has expectation 0 and variance 1:
X → (translation) → Y = X − μ → (stretching) → Z = (1/σ) · Y = (X − μ)/σ.
Example 3.6.1: (flipping a coin)
4. Special distributions
Several distributions play a special role in probability
and statistics: they are known to be useful in a wide
variety of applied problems.
4.1. The uniform discrete distribution
X takes the values x₁, x₂, …, x_m, each with the same probability:
x | x₁ | x₂ | x₃ | … | x_{m−1} | x_m
f(x) | 1/m | 1/m | 1/m | … | 1/m | 1/m
Notation: f_Uni(x; m), where m is the parameter of the family.
Example 4.1.1: (rolling one die)
The probability function is given by
f_Uni(x; 6) = 1/6, x = 1, 2, …, 6; 0, else.
Example 4.1.1 (continued):
What about the distribution function?
F_Uni(x; 6) is a step function with jumps of height 1/6 at x = 1, 2, …, 6
(values 1/6, 2/6, 3/6, 4/6, 5/6, 1).
Example 4.1.1 (continued):
E[X] = (1/m) · Σ (i=1..m) xᵢ = (1/6) · (1 + 2 + … + 6) = 21/6 = 7/2 = 3.5
Example 4.1.1 (continued):
E[X²] = (1/m) · Σ (i=1..m) xᵢ² = (1/6) · Σ (i=1..6) i² = (1/6) · ((6 + 1) · (2 · 6 + 1) · 6)/6 = (7 · 13)/6 = 91/6
V(X) = E[X²] − E[X]² = 91/6 − (7/2)² = 91/6 − 49/4 = (182 − 147)/12 = 35/12 = 2.9167
σₓ = √V(X) = √2.9167 = 1.7078
Remark: (no additive property)
The sum of two dice takes the values 2, 3, 4, …, 12 with probabilities 1/36, 2/36, 3/36, … ← not equally likely!
⇒ not a uniform discrete distribution!
4.2. The Bernoulli distribution (discrete)
f_Be(x; p) = 1 − p for x = 0; p for x = 1; 0, else.
Example 4.2.1:
Three dice are rolled; let X = 1 if at least one "6" shows, and X = 0 otherwise. Then
p = 1 − P["no six"] = 1 − (5/6)³ = (216 − 125)/216 = 91/216
⇒ E[X] = p = 91/216 = 0.4213,
V(X) = p · (1 − p) = (91/216) · (1 − 91/216) = 0.2438,
σₓ = 0.4938.
4.3. The Binomial distribution (discrete)
Definition
f_Bi(x; p, n) = C(n, x) · pˣ · (1 − p)^(n−x), x = 0, 1, 2, …, n; 0 ≤ p ≤ 1.
Derivation of the binomial distribution:
Let Yᵢ, i = 1, …, n, be independent random variables, each one Bernoulli distributed with parameter p (Bernoulli trials). Then X = Y₁ + … + Yₙ, the number of successes in the n trials, is binomially distributed with parameters n and p.
Example 4.3.1: (urn with replacement)
An urn contains 30 balls, 10 of which count as a "success"; n = 4 balls are drawn with replacement. Let X be the number of successes.
→ possible outcomes of X: x = 0, 1, 2, 3, 4
→ parameter values: p = 10/30 = 1/3; n = 4
→ P(X = 0) = C(4, 0) · p⁰ · (1 − p)⁴ = 16/81;
P(X = 1) = C(4, 1) · p¹ · (1 − p)³ = 32/81;
…
That is: X is binomially distributed with f_Bi(x; 1/3, 4).
→ E[X] = n · p = 4 · (1/3) = 4/3
→ V(X) = n · p · (1 − p) = 4 · (1/3) · (2/3) = 8/9
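The binomial probabilities and moments of Example 4.3.1 can be reproduced in a few lines; a minimal Python sketch:

```python
from math import comb

def f_bi(x, p, n):
    # binomial probability function C(n, x) * p^x * (1-p)^(n-x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

p, n = 1 / 3, 4
print([round(f_bi(x, p, n), 4) for x in range(n + 1)])  # starts with 16/81, 32/81, ...
print(n * p, n * p * (1 - p))                           # E[X] = 4/3, V(X) = 8/9
```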
Example 4.3.2: (election)
With n = 12 and p = 0.35, the standard deviation is
σ = √(n · p · (1 − p)) = √(12 · 0.35 · 0.65) = 1.6523.
4.4. The Poisson distribution (discrete)
f_Po(x; λ) = (λˣ/x!) · e^(−λ) for x = 0, 1, 2, …; 0, else.
Example 4.4.1: (roulette)
In n = 200 games of roulette, what is the probability that a given number (p = 1/37) comes up exactly 8 times?
Let us use the Poisson distribution with λ = n · p = 5.4054.
Then we find: f_Po(8; λ) = (5.4054⁸/8!) · e^(−5.4054) = 0.0812.
For comparison, the exact binomial value: f_Bi(8; 1/37, 200) = 0.0814.
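The quality of the Poisson approximation in Example 4.4.1 is easy to reproduce; a minimal Python sketch:

```python
from math import comb, exp, factorial

n, p, x = 200, 1 / 37, 8
lam = n * p                                          # 5.4054...
exact = comb(n, x) * p ** x * (1 - p) ** (n - x)     # binomial value
approx = lam ** x / factorial(x) * exp(-lam)         # Poisson value
print(round(exact, 4), round(approx, 4))             # 0.0814 vs 0.0812
```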
Example 4.4.2: (minigolf)
Example 4.4.2 (continued):
Let us compute the probability that Mr. Findhole
makes a hole in one in only 38 out of the 50
attempts: (a) exact; (b) with a suitable approximation.
4.5. The uniform continuous distribution

Definition
f_Uni(x; a, b) = 1/(b − a), a ≤ x ≤ b; 0, else.
Example 4.5.1:
According to schedule, a bus is expected to arrive every 30 minutes between midnight and 6 am. What is the probability that a passenger has to wait more than 10 minutes?
T = "waiting time for the next bus"
is a random variable with f_Uni(t; 0, 30) = 1/30, 0 ≤ t ≤ 30; 0, else.
Thus: P[T > 10] = 1 − P[T ≤ 10] = 1 − F_Uni(10) = 1 − 10/30 = 2/3.
Moreover: E[T] = 15 min and V(T) = 75 min².
Example 4.5.2: (waiting time at the ‘S-Bahn’ station)
Trains are coming every 12 minutes at a given
‘S-Bahn’ station. Suppose you do not know the exact
schedule and arrive at the station at a randomly
chosen point in time.
Let us define
X = ‘‘waiting time at the station’’
as the random variable of interest.
The image set of X is W = [0,12] (minutes).
4.6. The exponential distribution (continuous)
Definition
f_Ex(x; λ) = λ · e^(−λx), x ≥ 0; 0, else (λ > 0).
Example 4.6.1: (life test)
The lifespan X of a device has mean E[X] = 5000 hours, so
f_Ex(x) = (1/5000) · e^(−x/5000), 0 ≤ x < ∞.
Remark: λ = 1/E[X].

Example 4.6.1 (continued):
P[X > 10000] = 1 − F_Ex(10000) = 1 − (1 − e^(−2)) = 0.1353.
4.7. The normal distribution (continuous)
Definition (general normal distribution)
A random variable X has the normal distribution with mean μ and variance σ² (−∞ < μ < ∞; σ > 0) if its density function is defined as follows:
f_N(x; μ, σ²) = (1/√(2πσ²)) · e^(−(x − μ)²/(2σ²)), −∞ < x < ∞.
(Plot: bell-shaped densities centered at μ = −5, μ = 0, μ = 5.)
Example 4.7.1: (working with the normal distribution)
Let X be normally distributed with μ = 5 and σ = 3. Then
P(−2 < X ≤ 4) = P((−2 − μ)/σ < (X − μ)/σ ≤ (4 − μ)/σ)
= P(−7/3 < Z ≤ −1/3)
= F_Z(−1/3) − F_Z(−7/3)
= 0.3694 − 0.0098 = 0.3596
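Standard normal probabilities can be computed without tables via the error function; a minimal Python sketch using the values μ = 5, σ = 3 read off from the computation above:

```python
from math import erf, sqrt

def Phi(z):
    # standard normal distribution function F_Z
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 5, 3
prob = Phi((4 - mu) / sigma) - Phi((-2 - mu) / sigma)
print(round(prob, 4))   # 0.3596 = F_Z(-1/3) - F_Z(-7/3)
```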
Example 4.7.1 (continued):
The probability P(−2 < X ≤ 4) for different values of E[X] = μ and V(X) = σ²:
(Table over σ = 1, 2, 3 and μ = −5, 0, 5.)
Example 4.7.2: (finance: asset allocation)
It is usually assumed in portfolio theory that financial asset returns are normally distributed random variables.
→ E[R] = μ: expected return;
σ_R = √V(R): volatility (risk).
An investor wants to invest a certain amount of money in three different shares. With μ₂ = 0.36, σ₂ = 0.2 and μ₃ = 0.1, σ₃ = 0.04, the probabilities of a negative return are
P(R₂ < 0) = P((R₂ − μ₂)/σ₂ < (0 − 0.36)/0.2) = P(Z < −1.8) = 0.0359
P(R₃ < 0) = P((R₃ − μ₃)/σ₃ < (0 − 0.1)/0.04) = P(Z < −2.5) = 0.0062
5. Multivariate random variables
To formalize many underlying theories, as well as to solve many applied problems, one also needs to consider the relations among the different random variables under investigation. In fact, that information might play a prominent role and cannot be neglected.
Example 5.0.1: (finance: portfolio selection)
5.1. Joint distribution and marginal distributions
The joint probability function f(xᵢ, yⱼ) = P(X = xᵢ, Y = yⱼ) satisfies:
(1) f(xᵢ, yⱼ) ≥ 0
(2) Σᵢ Σⱼ f(xᵢ, yⱼ) = 1 ⇒ (3) f(xᵢ, yⱼ) ≤ 1, ∀ i, j
For a set C of pairs:
P((X, Y) ∈ C) = Σ ((xᵢ, yⱼ) ∈ C) f(xᵢ, yⱼ).
Example 5.1.1: (urn without replacement)
Two balls are drawn from an urn without replacement. The joint probability function of
(X, Y) = ("label of the first ball", "label of the second ball")
is given by a table of the values f(xᵢ, yⱼ).
Example 5.1.3: (tossing a coin)
A coin is tossed four times; for each outcome we record (X, Y), where X = number of heads and Y = number of changes of face within the sequence.
→ # outcomes: 2⁴ = 16
TTTT: (0,0); TTTH: (1,1); TTHT: (1,2); THTT: (1,2); HTTT: (1,1);
TTHH: (2,1); THTH: (2,3); THHT: (2,2); HTHT: (2,3); HHTT: (2,1);
HTTH: (2,2); THHH: (3,1); HTHH: (3,2); HHTH: (3,2); HHHT: (3,1);
HHHH: (4,0).
Example 5.1.3 (continued):
(Joint table, rows x = 0, …, 4, columns y = 0, …, 3; e.g.:)
x₁ = 0: 1/16, 0, 0, 0 → fₓ(0) = 1/16
x₅ = 4: 1/16, 0, 0, 0 → fₓ(4) = 1/16

Marginal probability functions:
fₓ(xᵢ) = P[X = xᵢ] = Σⱼ f(xᵢ, yⱼ) = pᵢ,•
f_y(yⱼ) = P[Y = yⱼ] = Σᵢ f(xᵢ, yⱼ) = p•,ⱼ
Continuous random variables:
Let X and Y be continuous random variables. The function f(x, y) with
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫ (a..b) ∫ (c..d) f(x, y) dy dx
is called the joint density function of (X, Y).

Example: f(x, y) = (1/(2π)) · e^(−(x² + y²)/2), −∞ < x, y < +∞.
Example 5.1.5:
The joint density function of (X, Y) is given by
f(x, y) = (12/7) · (x² + xy), if 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1; 0, else.
→ non-negativity property? ✓ (both terms are non-negative on the unit square)
→ normalization:
(12/7) · ∫ (0..1) ∫ (0..1) (x² + xy) dy dx = (12/7) · ∫ (0..1) [x²y + x · y²/2] (0..1) dx
= (12/7) · ∫ (0..1) (x² + x/2) dx = (12/7) · [x³/3 + x²/4] (0..1)
= (12/7) · (1/3 + 1/4) = (12/7) · (7/12) = 1
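The normalization of the joint density can also be checked numerically; a minimal Python sketch with a midpoint-rule double integral (the grid size is arbitrary):

```python
def f(x, y):
    return 12 / 7 * (x ** 2 + x * y)    # joint density of Example 5.1.5

n = 400                                  # grid points per axis
h = 1.0 / n
total = sum(f((i + 0.5) * h, (j + 0.5) * h)
            for i in range(n) for j in range(n)) * h * h
print(round(total, 6))                   # ~1.0: the density integrates to one
```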
In the continuous case, the marginal (probability) density functions of X and Y are
fₓ(x) = ∫ (−∞..+∞) f(x, y) dy
and
f_y(y) = ∫ (−∞..+∞) f(x, y) dx,
respectively.
Example 5.1.5 (continued):
f_x(x) = (12/7) · ∫_0^1 (x² + xy) dy = (12/7) · [x²·y + x·y²/2]_0^1 = (12/7) · (x² + x/2), x ∈ [0, 1].
f_y(y) = (12/7) · ∫_0^1 (x² + xy) dx = (12/7) · [x³/3 + y·x²/2]_0^1 = (12/7) · (1/3 + y/2), y ∈ [0, 1].
235
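The computations of Example 5.1.5 can also be verified numerically. A sketch (assuming SciPy; the point x0 = 0.5 is an arbitrary test value):

```python
from scipy import integrate

# dblquad integrates func(y, x) over y first, then over x.
joint = lambda y, x: 12 / 7 * (x**2 + x * y)

total, _ = integrate.dblquad(joint, 0, 1, lambda x: 0, lambda x: 1)
print(total)   # ~ 1.0: f is a valid joint density

x0 = 0.5
fx_num, _ = integrate.quad(lambda y: joint(y, x0), 0, 1)
print(fx_num, 12 / 7 * (x0**2 + x0 / 2))   # numerical vs. closed-form marginal
```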
Remark:
The expected values and variances of the marginal distributions of a bivariate random vector can be computed using the marginal probability/density functions:
→ discrete case:
µ_x = E[X] = ∑_i x_i · f_x(x_i);
σ_x² = V(X) = ∑_i (x_i − µ_x)² · f_x(x_i).
→ continuous case:
µ_x = E[X] = ∫_{−∞}^{+∞} x · f_x(x) dx;
σ_x² = V(X) = ∫_{−∞}^{+∞} (x − µ_x)² · f_x(x) dx.
236
Definition
The joint (cumulative) distribution function of (X, Y) is
F(x, y) = P[X ≤ x, Y ≤ y].
237
Practical computation:
If (X, Y) is discrete:
F(x, y) = ∑_{x_i ≤ x} ∑_{y_j ≤ y} f(x_i, y_j).
If (X, Y) is continuous:
F(x, y) = ∫_{−∞}^x ∫_{−∞}^y f(u, v) dv du.
238
5.2. Conditional distributions and stochastic independence
Conditional probability function of X given Y = y_j (discrete case):
f_{X|Y=y_j}(x_i) = f_{X,Y}(x_i, y_j) / f_Y(y_j) = p_{i,j} / p_{•,j}, if p_{•,j} > 0, and 0 else.
Another example: f_{Y|X=0}(y):
Y = 0: 1/4;  Y = 1: 1/2;  Y = 2: 1/4;  Y = 3: 0;  …
242
Example 5.2.2: (continuous case)
Let (X, Y) denote a two-dimensional continuous random vector with joint density given by
f(x, y) = λ² · e^(−λx), 0 ≤ y ≤ x;  0, else.
→ Marginal densities:
f_x(x) = ∫_0^x λ² · e^(−λx) dy = λ² · e^(−λx) · x, x ≥ 0;
f_y(y) = ∫_y^{+∞} λ² · e^(−λx) dx = λ² · [−(1/λ) · e^(−λx)]_y^{+∞} = λ · e^(−λy), y ≥ 0
(→ Y is exponentially distributed: Exp(λ)).
243
Example 5.2.2 (continued):
A = {( x, y ) | x ∈ [0, ∞ ), y ∈ [0, x ]}
or
A = {( x, y ) | y ∈ [0, ∞ ), x ∈ [ y , ∞ )}
244
Example 5.2.2 (continued):
245
Definition (independent random variables)
X and Y are stochastically independent if
f_{X,Y}(x, y) = f_X(x) · f_Y(y), −∞ < x, y < +∞.
It then also follows that (for all y with f_Y(y) > 0 and all x with f_X(x) > 0, respectively)
f_{X|Y=y}(x) = f_X(x) and f_{Y|X=x}(y) = f_Y(y).
246
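For discrete tables, this criterion can be checked directly by comparing f(x_i, y_j) with the product of the marginals. An illustrative sketch (NumPy assumed; not part of the original slides):

```python
import numpy as np

def is_independent(f, tol=1e-12):
    """True if the joint table factorizes into its marginals."""
    fx = f.sum(axis=1, keepdims=True)   # column vector of f_X
    fy = f.sum(axis=0, keepdims=True)   # row vector of f_Y
    return np.allclose(f, fx * fy, atol=tol)

# Two independent fair coins: the joint table is the product of marginals.
print(is_independent(np.full((2, 2), 0.25)))   # True
# Joint table of Example 5.1.3: X and Y are dependent.
f = np.array([[1, 0, 0, 0], [0, 2, 2, 0], [0, 2, 2, 2],
              [0, 2, 2, 0], [1, 0, 0, 0]]) / 16
print(is_independent(f))                       # False
```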
5.3. Covariance and correlation
248
We introduce summaries of a joint distribution that
enable us to measure the relationship between two
random variables, i.e. their tendency to vary together
rather than independently.
Definition (covariance)
Cov(X, Y) = E[(X − µ_x) · (Y − µ_y)] = E[X·Y] − µ_x · µ_y.
252
5.4. Sums and sample means of random variables
Expected value of a sum:
E[X + Y] = ∑_{∀i} ∑_{∀j} (x_i + y_j) · f(x_i, y_j)
= ∑_{∀i} ∑_{∀j} x_i · f(x_i, y_j) + ∑_{∀i} ∑_{∀j} y_j · f(x_i, y_j)
= E[X] + E[Y];
more generally, E[X1 + … + Xn] = E[X1] + … + E[Xn].
254
Variance of a sum of two random variables?
V(X + Y) = E[((X + Y) − (µ_x + µ_y))²]
= E[((X − µ_x) + (Y − µ_y))²]
= E[(X − µ_x)² + (Y − µ_y)² + 2 · (X − µ_x)(Y − µ_y)]
= E[(X − µ_x)²] + E[(Y − µ_y)²] + 2 · E[(X − µ_x)(Y − µ_y)]
= V(X) + V(Y) + 2 · Cov(X, Y).
For pairwise uncorrelated random variables this reduces to
V(X1 + … + Xn) = V(X1) + … + V(Xn).
255
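A Monte Carlo check of this identity (illustrative sketch, NumPy assumed; the linear dependence of y on x is an arbitrary choice to create correlation):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # correlated with x

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y)[0, 1]
print(lhs, rhs)   # both ~ 1 + 1.25 + 2*0.5 = 3.25
```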
Sample mean of uncorrelated random variables:
V(X̄_n) = V((1/n) · (X1 + … + Xn)) = (1/n²) · ∑_{i=1}^{n} V(X_i) = (1/n²) · n · σ² = σ²/n
⇒ σ_{X̄_n} = σ/√n.
257
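The 1/√n law is easy to see in simulation. A sketch (NumPy assumed; normal data are an arbitrary choice, any distribution with variance σ² behaves the same):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n, reps = 2.0, 25, 50_000
means = rng.normal(0, sigma, size=(reps, n)).mean(axis=1)
print(means.std())   # ~ sigma / sqrt(n) = 0.4
```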
Example 5.4.1 (continued):
a) Describe the random variable
X: "player's net winnings"
using Bernoulli distributed random variables.
→ E[U] = 3 · E[X] = 3/2;
→ V(U) = 3 · V(X) = 3 · (3/4) = 9/4.
259
6. The Central Limit Theorem
261
Let us consider a sequence of n random variables. Assume that the random variables X1, …, Xn are independent and identically distributed (i.i.d.) with (both finite)
E[X_i] = µ and V(X_i) = σ².
265
Theorem: Limit Theorem of De Moivre and Laplace
Let S_n be a binomially distributed random variable with parameters n and p. Then, as n increases, its distribution function converges to that of a normal distribution with the corresponding moments:
F_Bi(s_n; n, p) → F_N(s_n; np, np(1 − p)).
268
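A numerical illustration of the theorem (sketch, SciPy assumed; n, p, and s_n are arbitrary example values):

```python
import numpy as np
from scipy import stats

n, p, s = 100, 0.3, 35
exact = stats.binom.cdf(s, n, p)
approx = stats.norm.cdf(s, loc=n * p, scale=np.sqrt(n * p * (1 - p)))
corrected = stats.norm.cdf(s + 0.5, loc=n * p, scale=np.sqrt(n * p * (1 - p)))
print(exact, approx, corrected)   # the continuity-corrected value is closest
```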
Example 6.3:
270
Example 6.4 (continued):
272
Representative random sample. Scales of measurement of the observed variables: nominal scale; 2. ordinal scale; 3. ratio scale.
275
7. Descriptive statistics
278
Goal:
The task of descriptive statistics is to introduce a
number of descriptive, in most cases graphical
methods to summarize all the information about the
variables under investigation and illustrate the main
features, without distorting the picture.
279
7.1. Frequency tables, histograms, and empirical distributions
281
Example 7.1.1: (radioactive decay of Americium-241)
Class interval (emissions): 10    11    12    13    14    15    16    >17
Numbers:                    123   101   74    53    23    15    9     5
285
Example 7.1.2: (‘population pyramids’, book page 22)
Remark: Histograms can also be used for variables measured on a nominal or ordinal scale (bar charts).
[Figure: empirical distribution function F_n(y_j) plotted against y_j.]
287
7.2. Summarizing data using numerical techniques
Definition (measures of location)
a1) Arithmetic mean:
x̄ = (1/n) · ∑_{i=1}^{n} x_i.
289
Definition (measures of location)
a2) Median: (→ ordinal and ratio scale)
x_Med = x_((n+1)/2), if n is odd;
x_Med = (1/2) · (x_(n/2) + x_(n/2+1)), if n is even.
290
Example 7.2.1:
Let us consider the following observations:
4, 7, 7, 7, 12, 12, 13, 16, 19, 23, 23, 97.
mean: x̄ = 20;  mode: x_M = 7;  median: x_Med = (12 + 13)/2 = 12.5.
291
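These three location measures can be reproduced with the Python standard library (illustrative sketch, not part of the original slides):

```python
import statistics

data = [4, 7, 7, 7, 12, 12, 13, 16, 19, 23, 23, 97]
print(statistics.mean(data))     # 20
print(statistics.mode(data))     # 7
print(statistics.median(data))   # 12.5
```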
Definition (quantiles)
293
Location measures like the mean, median, or mode give only some information, namely about the central tendency of a distribution.
294
Definition (measures of dispersion)
c1) Range: the difference between the largest and the smallest observation; the simplest measure of dispersion:
range = x_max − x_min.
c2) Mean quartile distance:
MQA = ((Q3 − Q2) + (Q2 − Q1))/2 = IQA/2,
where IQA = Q3 − Q1 is called the inter-quartile range.
295
Example 7.2.4:
n = 14 observations; range = 38 − 11 = 27.
Q2 = (x(7) + x(8))/2 = 26.8;  Q1 = x(4) = 18;  Q3 = x(11) = 31.5
⇒ IQA = Q3 − Q1 = 13.5  ⇒ MQA = 6.75.
296
Definition (measures of dispersion)
c3) Variance and standard deviation as measures of dispersion:
The mean of the squared distances of the observations from the arithmetic mean,
s_x² = (1/n) · ∑_{i=1}^{n} (x_i − x̄)²,
is called the (empirical) variance. The positive square root of the variance,
s_x = +√(s_x²),
is called the (empirical) standard deviation.
297
Example 7.2.5:
Consider the observations
3, 5, 9, 9, 6, 6, 3, 7, 7, 6, 7, 6, 5, 7, 6, 9, 6, 5, 3, 5.
Let us compute the empirical variance:

 j | x_j | n_j  | h_j  | h_j·x_j | x_j − x̄ | (x_j − x̄)² | h_j·(x_j − x̄)²
 1 |  3  |  3   | 0.15 |  0.45   |   −3     |     9       |   1.35
 2 |  5  |  4   | 0.20 |  1.00   |   −1     |     1       |   0.20
 3 |  6  |  6   | 0.30 |  1.80   |    0     |     0       |   0
 4 |  7  |  4   | 0.20 |  1.40   |    1     |     1       |   0.20
 5 |  9  |  3   | 0.15 |  1.35   |    3     |     9       |   1.35
 Σ |     | n=20 |  1   | x̄ = 6  |          |             | s_x² = 3.10
298
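The same computation from the frequency table, as a sketch (NumPy assumed; not part of the original slides):

```python
import numpy as np

x   = np.array([3, 5, 6, 7, 9])    # distinct values x_j
n_j = np.array([3, 4, 6, 4, 3])    # absolute frequencies
h   = n_j / n_j.sum()              # relative frequencies h_j

mean = (h * x).sum()               # x-bar = 6.0
var  = (h * (x - mean) ** 2).sum() # s_x^2 = 3.1
print(mean, var)
```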
Definition
299
Example 7.2.6: (stock prices, 250 working days)
300
7.3. Boxplot
Whiskers: the lower whisker ends at c = the smallest observation x_i not below Q1 − 1.5·IQA; the upper whisker ends at the largest observation not above Q3 + 1.5·IQA.

Data (n = 16):
1.5; 3.5; 6.5; 11.5; 12.5; 14; 17; 17; 19; 20; 23.5; 32.5; 34.5; 39; 55.5; 119

Q2 = (x(8) + x(9))/2 = 18;  Q1 = (x(4) + x(5))/2 = 12;  Q3 = (x(12) + x(13))/2 = 33.5
⇒ IQA = 21.5, 1.5·IQA = 32.25;  Q3 + 1.5·IQA = 65.75;  Q1 − 1.5·IQA < 0.

[Boxplot: box from Q1 = 12 to Q3 = 33.5 with median Q2 = 18; whiskers at 1.5 and 55.5; 119 marked as an outlier.]
Reading the plot: 119 is an outlier; small relative dispersion; the distribution is not symmetric (more concentration around Q1/Q2).
303
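The quartiles, fences, and the outlier can be reproduced with the quantile rule used in Example 7.2.4 (average the two neighbouring order statistics when n·p is an integer, otherwise take the next one). A sketch, not part of the original slides:

```python
import math

def quartile(sorted_x, p):
    """Quantile rule of the slides, based on order statistics."""
    k = len(sorted_x) * p
    if k == int(k):
        k = int(k)
        return 0.5 * (sorted_x[k - 1] + sorted_x[k])
    return sorted_x[math.ceil(k) - 1]

data = sorted([1.5, 3.5, 6.5, 11.5, 12.5, 14, 17, 17, 19, 20,
               23.5, 32.5, 34.5, 39, 55.5, 119])
q1, q2, q3 = (quartile(data, p) for p in (0.25, 0.5, 0.75))
iqa = q3 - q1
print(q1, q2, q3, iqa)                          # 12.0 18.0 33.5 21.5
print([v for v in data if v > q3 + 1.5 * iqa])  # [119] -> outlier
```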
7.4. Quantile-Quantile-plot
[QQ-plot: empirical quantiles plotted against the quantiles of the theoretical distribution (here: the normal distribution).]
308
8. Estimation of unknown parameters
313
A central problem in statistics consists of the
identification of random variables of interest, the
specification of a joint distribution or a family of
possible joint distributions for the observable random
variables, and the identification of any parameters of
those distributions that are assumed unknown.
1) Point estimation:
For each parameter one gets a single value from
the sample as a result of the estimation procedure.
2) Confidence intervals:
The idea is to get some intervals of values in which
the true unknown parameters are contained with
high probability (confidence).
315
The starting point in both approaches is the definition of a
so-called estimator (or statistic).
θ̂_n(X1, …, Xn).
316
8.1. Intuitive examples of estimators
Example 8.1.1:
x_i: 176, 180, 181, 168, 177, 186, 184, 173, 182, 177
µ̂ = x̄_n = 178.4 cm.
320
Question: Is this a good estimator?
322
Example 8.1.1 (continued):
σ̂² = s_x² = 25.84.
323
Example 8.1.3: (firm’s lifetime)
324
Estimators t(X1,…, Xn) are random variables and therefore
have their own distribution.
Notation:
• The symbol "^" marks an estimator (pronounced "hat").
• T = t(X1, …, Xn) is a random variable; its realization (called the estimate) is based on the sample observations x_i (i = 1, …, n) of the corresponding random variables in the sample.
• E[T] = µ_T denotes the expected value of the estimator T.
• V(T) = E[(T − µ_T)²] = σ_T² denotes the variance of the estimator T.
8.2. Properties of estimators
A) Unbiased estimators
An estimator T = t(X1, …, Xn) = θ̂ is an unbiased estimator of θ if E[T] = µ_T exists and E[T] = µ_T = θ for all values of θ.

i) T = t(X1, …, Xn) = X̄_n = (1/n) · ∑_{i=1}^{n} X_i is an unbiased estimator for µ = E[X] (i.e., µ = E[X_i], i = 1, …, n):
E[T] = E[X̄_n] = E[(1/n) ∑_{i=1}^{n} X_i] = (1/n) ∑_{i=1}^{n} E[X_i] = (1/n) · n · µ = µ.

ii) T = t(X1, …, Xn) = S² = (1/n) · ∑_{i=1}^{n} (X_i − X̄)² is a biased estimator for σ²: its expected value equals ((n−1)/n) · σ² and not σ² (for proof see next slide).
329
Example 8.2.1 (continued):
E[T] = E[S²] = E[(1/n) ∑_{i=1}^{n} (X_i − X̄)²] = E[(1/n) ∑_{i=1}^{n} ((X_i − µ) − (X̄ − µ))²]
= (1/n) · E[∑_{i=1}^{n} (X_i − µ)² − n · (X̄ − µ)²]
= (1/n) · (∑_{i=1}^{n} E[(X_i − µ)²] − n · E[(X̄ − µ)²])
= (1/n) · (n · V(X) − n · V(X̄))
= (1/n) · (n · σ² − n · σ²/n) = (1/n) · σ² · (n − 1) = ((n−1)/n) · σ².

⇒ Therefore, an unbiased estimator of σ² is given by
T = t(X1, …, Xn) = (n/(n−1)) · S² = (1/(n−1)) · ∑_{i=1}^{n} (X_i − X̄)².
330
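The bias of S² is easy to confirm by simulation (sketch, NumPy assumed; normal data are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, reps = 5, 4.0, 200_000
x = rng.normal(0, np.sqrt(sigma2), size=(reps, n))

print(x.var(axis=1, ddof=0).mean())   # S^2: ~ (n-1)/n * sigma2 = 3.2
print(x.var(axis=1, ddof=1).mean())   # corrected version: ~ sigma2 = 4.0
```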
Example 8.2.1 (continued):
iii) Consider a Bernoulli random sample Z1, …, Zn:
f_z(z) = p^z · (1 − p)^(1−z), z ∈ {0, 1}.
E[Z̄ · (1 − Z̄)] = E[Z̄] − E[Z̄²] = p − (n²p² + n·p(1−p))/n²
= ((n−1)/n) · p(1−p) ≠ p(1−p).
332
Graphical illustration:
333
Definition (goodness properties)
B) Asymptotically unbiased estimators
An estimator T = t(X1, …, Xn) is asymptotically unbiased for θ if
lim_{n→∞} E[T] = θ.
334
Definition (goodness properties)
C) Consistent estimators
A sequence of estimators {θ̂_n = t(X1, …, Xn)}_n that converges in probability to the unknown parameter θ being estimated, as n → ∞, is called consistent; that is:
P(|θ̂_n − θ| > ε) → 0 as n → ∞, for all ε > 0.
Notation: θ̂_n →(P) θ as n → ∞.
335
Graphical illustration of consistency:
Data: firm’s lifetime example 8.1.3.
336
Theorem (practical consistency check):
If lim_{n→∞} E[θ̂_n] = θ and lim_{n→∞} V(θ̂_n) = 0, then θ̂_n is consistent for θ.
337
Example 8.2.2: (sample mean)
Consider the competing estimator that uses only every second observation:
X̄_b = (2/n) · ∑_{i=1}^{n/2} X_{2i}.
E[X̄_b] = E[(2/n) ∑_{i=1}^{n/2} X_{2i}] = (2/n) ∑_{i=1}^{n/2} E[X_{2i}] = (2/n) · (n/2) · E[X_{2i}] = µ.
340
Graphical illustration of efficiency (continued)
What about the variances? We can compute that
V(X̄_a) = (1/n) · σ²  and  V(X̄_b) = (2/n) · σ².
341
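A simulation of both sampling distributions (sketch, NumPy assumed; standard normal data are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 20, 100_000
x = rng.normal(0, 1, size=(reps, n))

xbar_a = x.mean(axis=1)          # uses all n observations
xbar_b = x[:, ::2].mean(axis=1)  # uses only every second observation
print(xbar_a.var())              # ~ 1/n = 0.05
print(xbar_b.var())              # ~ 2/n = 0.10
```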
[Graphical illustration of efficiency (regression): sampling distributions of X̄_a and X̄_b.]
342
Definition (goodness properties)
Mean squared error (MSE):
MSE(θ) = E[(T − θ)²] = E[((T − µ_T) + (µ_T − θ))²]
= E[(T − µ_T)²] + 2 · (µ_T − θ) · E[T − µ_T] + (µ_T − θ)², where E[T − µ_T] = 0,
= V(T) + bias².
344
8.3. Main methods to get estimators
350
B) Least-squares method
Idea: this is the method generally used in linear regression.
Suppose that the goal is to estimate the unknown mean parameter µ. The method chooses as estimator of µ the function µ̂_LS defined as
µ̂_LS = argmin_µ ∑_{i=1}^{n} (X_i − µ)²;
setting the derivative −2 · ∑_{i=1}^{n} (X_i − µ) to zero shows that the minimizer is µ̂_LS = X̄_n.
C) Maximum-likelihood method
Example: each night a lion eats people; it eats i people with probability
p(i | θ), θ ∈ Θ = {θ_j : j = 1, 2, 3}.
The numerical values are given in the following table (see next slide):
352
 i        |  0   |  1   |  2   |  3
 p(i|θ1)  | 0.00 | 0.05 | 0.05 | 0.90
 p(i|θ2)  | 0.05 | 0.05 | 0.80 | 0.10
 p(i|θ3)  | 0.90 | 0.08 | 0.02 | 0.00
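The maximum-likelihood idea for this example in one line: for the observed i, choose the θ_j under which that observation is most probable. An illustrative sketch (plain Python; not part of the original slides):

```python
# Rows of the table above: p(i | theta_j) for i = 0..3.
p = {
    "theta_1": [0.00, 0.05, 0.05, 0.90],
    "theta_2": [0.05, 0.05, 0.80, 0.10],
    "theta_3": [0.90, 0.08, 0.02, 0.00],
}

def theta_ml(i):
    """Parameter value maximizing p(i | theta)."""
    return max(p, key=lambda t: p[t][i])

print([theta_ml(i) for i in range(4)])
# ['theta_3', 'theta_3', 'theta_2', 'theta_1']
```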
For a Bernoulli random sample, maximizing the log-likelihood gives
∂ log L(p; x1, …, xn)/∂p = (∑_{i=1}^{n} x_i)/p̂ − (∑_{i=1}^{n} (1 − x_i))/(1 − p̂) = 0
⇔ (1 − p̂) · ∑_{i=1}^{n} x_i = p̂ · (n − ∑_{i=1}^{n} x_i)
⇒ p̂_ML = (1/n) ∑_{i=1}^{n} X_i = X̄_n.
356
Example 8.3.2: Let X1, …, Xn be a random sample from a uniform continuous distribution:
f_Uni(x) = 1/θ, 0 ≤ x ≤ θ;  0, else.
L(θ; x1, …, xn) = (1/θ)^n, if all x_i ∈ [0, θ].
L is monotonically decreasing in θ, but all x_i ∈ [0, θ] requires θ ≥ x_i, i = 1, …, n
⇒ θ̂_ML = max(X1, …, Xn).
357
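A short simulation of this estimator (sketch, NumPy assumed; θ = 5 is an arbitrary true value):

```python
import numpy as np

rng = np.random.default_rng(4)
theta = 5.0
x = rng.uniform(0, theta, size=100)
print(x.max())   # theta_hat_ML: slightly below the true theta = 5
```

Note that max(X1, …, Xn) always lies at or below the true θ, so this ML estimator is biased, though it is consistent.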
Graphical illustration of example 8.3.2:
358
Properties of maximum likelihood estimators
359
9. Confidence intervals
361
9.1. The idea
Confidence intervals provide a method of adding
more information to an estimator when we wish to
estimate an unknown parameter θ.
We can find an interval (A,B) that we think has high
probability of containing θ. The length of such an
interval gives us an idea about how closely we can
estimate θ and how large the sampling error is.
362
Definition
Symmetric (1−α)-confidence interval:
CONF_{1−α}(θ) = [θ̂_n − f_n ; θ̂_n + f_n],
where f_n denotes the sampling error and is computed in such a way that the confidence interval contains the unknown parameter θ with a given probability (1−α):
P[θ ∈ CONF_{1−α}(θ)] = 1 − α.
9.2. Example of a confidence interval (mean of a distribution, large samples)
For a large sample X1, …, Xn with sample mean X̄_n, the symmetric (1−α)-confidence interval for µ is
CONF_{1−α}(µ) = [X̄_n − q_{1−α/2} · σ/√n ; X̄_n + q_{1−α/2} · σ/√n],
where q_{1−α/2} denotes the (1 − α/2)-quantile of the standard normal distribution.
367
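A sketch of this interval in code (SciPy assumed; the sample is simulated for illustration, with σ replaced by the empirical standard deviation, as is usual for large samples):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.normal(10, 2, size=200)             # illustrative sample, true mu = 10

alpha = 0.05
q = stats.norm.ppf(1 - alpha / 2)           # (1 - alpha/2)-quantile, ~1.96
f_n = q * x.std(ddof=1) / np.sqrt(len(x))   # sampling error
print(x.mean() - f_n, x.mean() + f_n)       # covers mu in ~95% of samples
```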
Example 9.2.1: (rent index)
Exercises
375
Exercise 1.1.1:
Give the sample space in the following cases:
1) A person is asked about her birthday.
2) K persons are asked about their birthdays.
3) Position of a locator on the unit circle.
4) Let S = {"0 times six", "2 times six"}.
Is it a sample space for the experiment of rolling
two dice?
376
Exercise 1.2.1:
1) Rolling one die:
2) A = {(x, y) | ax + by + c = 0},
   B = {(x, y) | ax + by + d = 0}  → A ∩ B = ?
Show: (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ and (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ (De Morgan).
377
Exercise 1.2.1 (continued):
378
Exercise 1.2.1 (continued):
Write the following events as subsets of S:
A=?
B=?
C=?
Aᶜ = ?
Cᶜ = ?
379
Exercise 1.2.1 (continued):
B∩C = ?
B\C =?
A ∪C = ?
A ∩B ∩C = ?
380
Exercise 1.3.1:
381
Exercise 1.5.1:
382
Exercise 1.5.2: (sick notes)
P(E_i): 0.751, 0.1, 0.063, 0.061, 0.011, 0.008, 0.005, 0.001 (Σ = 1)
Compute:
→ P(“X ill”) = ?
→ P(“Y ill”) = ?
→ P(“X and Y ill”) = ?
→ P(“X or Y ill”) = ?
383
Exercise 1.6.1:
Compute the probability of the following two events:
a) A: “Rolling four dice we get at least one six’’
b) B: “Rolling two dice 24 times we get at least one
twelve as the sum of the numbers’’
384
Exercise 1.6.2:
385
Exercise 1.7.1: (sick notes, see exercise 1.5.2)
386
Exercise 1.7.2:
Two independent elevators A and B, identical from
both a technical and a functional point of view, are
located in an office building.
The probability that the elevator A (or B) at a given
point in time is on the ground floor equals 0.2.
387
Exercise 1.7.2 (continued):
389
Exercise 1.8.1: (draw of the ‘Zusatzzahl’ in Lotto)
A: “Zusatzzahl 1 is drawn’’?
390
Exercise 1.8.2: (supplier with differences in quality)
An automaker equips his vehicles with air conditioning
systems that he gets from three different suppliers.
P(A | M) = ?
P(B | M) = ?
P(C | M) = ?
392
Exercise 1.9.2: (urn)
394
Exercise 3.1.1:
S = {(i, j) | i , j ∈ {1,...,6}}
X = "sum of the numbers": ( i, j ) → i+j
W = {2,...,12}
395
Exercise 3.1.2:
A machine produces defective items with probability 5%.
Then: W={0,1,2,3,4}.
396
Exercise 3.2.1:
f_x(x) = K, for x = 0;
         2K, for x = 1;
         3K, for x = 2;
         5K, for x = 3;
         0, else.
397
Exercise 3.2.1 (continued):
→ P (1 < X ≤ 3 ) = ?
P ( X > 1) = ?
P ( X = 1) = ?
398
Exercise 3.2.2: (revenue under uncertain conditions)
399
Exercise 3.3.1:
Let us consider a random variable X with density
function f given by
f(x) = 3x², for 0 ≤ x ≤ 1/2;
       (3/2)·x, for 1/2 < x ≤ c;
       0, else.
a) Determine the constant c such that the function f is
a density function. Sketch the density function.
b) Compute the distribution function of X.
400
Exercise 3.4.1:
f(x) = 3x², for 0 ≤ x ≤ 1/2;
       (3/2)·x, for 1/2 < x ≤ c;
       0, else.
E[X] = ?
401
2. Waiting time at the 'S-Bahn' station (continued)
403
Exercise 3.5.2:
Compute the expected value and the variance of the
random variable
Y = 3X + 2,
where X has the probability function
 x:    1    2    5
 f(x): 0.2  0.3  0.5
404
Exercise 3.5.3: (defective piping)
A piping is made of 20 segments. Given that the outflow quantity is smaller than the inflow quantity, there must be a leak somewhere.
Let us assume that there is exactly one leak and
that it is located in each one of the segments with
equal probability 1/20.
We would like to find the segment in which the
leak is located with the smallest possible number
of inspections (that is, measuring the flow rate at
each segment’s borders).
405
Exercise 3.5.3 (continued):
406
Exercise 4.3.1: (quality check)
In the production of high-quality drinking glasses the
percentage of defective items equals 20%.
In the course of a quality check we take randomly
four drinking glasses with replacement.
X: ‘‘# of defective glasses in the sample’’
Y: ‘‘# of flawless glasses in the sample’’
Compute the probability that:
(1) exactly one glass in the sample is defective;
(2) at least two glasses in the sample are defective;
(3) exactly one glass in the sample is flawless.
Compute E[X], E[Y], V(X), V(Y).
407
Exercise 4.4.1: (clients at the bank counter)
Clients come to a given bank counter at some unpredictable
point in time: in the morning (8-12) on average 12 clients per
hour and in the afternoon (14-16) on average 10 clients per
hour.
Assume that clients arrive independently of each other, whether in the morning or in the afternoon. Compute the probability…
1) that on a given day between 09.00h and 09.15h no client
comes to the bank counter;
2) that on a given day between 15.00h and 15.15h no client
comes to the bank counter;
3) that on a given day between 15.30h and 16.00h more than
6 clients show up at the bank counter.
408
Exercise 4.6.1: (clients at the bank counter)
(see exercise 4.4.1, continued)
Let X ~ f_N(x; 2, 16).
→ P(X ≤ 0) = ?
→ P(X ≤ 2) = ?
→ Find q such that P(X ≤ q) = 0.25;
→ Find q such that P(X ≤ q) = 0.75.
410
Exercise 5.1.1:
411
Exercise 5.4.1:
412
Exercise 5.4.2:
416
Exercise 6.3:
417
Exercise 7.4.1:
418
Exercise 8.2.1: (estimation of λ in a Poisson distribution)
T1 = X̄_n  and  T2 = (n/(n−1)) · S².
i) Are T1 and T2 unbiased for λ ?
ii) Is T1 consistent for λ ?
iii)Is T1 (relative) efficient with respect to T2?
419
Exercise 8.2.2:
Let
X̄ = (1/n) · ∑_{i=1}^{n} X_i  and  X̃ = (1/(n+1)) · (2·X1 + ∑_{i=2}^{n} X_i)
be two competing estimators for the mean parameter E[X] = µ of the population distribution. Assume that V(X) = σ² exists.
421
Exercise 9.2.1:
423