Tutorial 4


TUTORIAL 4 SOLUTIONS

#8.10.12 Suppose that you had to choose either the method of moments estimates or the maximum likelihood estimates in Example C of Section 8.4 and Example C of Section 8.5. Which would you choose and why?
Solution We shall focus on Example C of Section 8.4, as the same reasons apply to Example C of Section 8.5.
The method of moments estimates for the parameters λ and α are available in closed form:

λ̂ = X̄/σ̂²,
α̂ = X̄²/σ̂².

This is typical of method of moments estimates and is one of their attractive properties.
On the other hand, the mle's for λ and α are not available in closed form.
However, it can be shown by simulation (and also theoretically) that the sampling distributions of the two mle's are substantially less dispersed than those of the method of moments estimates.
This means that the two mle's are generally more accurate than the method of moments estimates for λ and α.
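The closed-form method of moments estimates above can be sketched in a few lines of numpy. This is a minimal illustration, not part of the original solution; the true parameter values (α = 2, λ = 1) and the sample size are arbitrary choices.

```python
import numpy as np

# Illustrative check of the closed-form MoM estimates for a gamma sample.
# Assumed setup: true alpha = 2, true lambda = 1, n = 100,000 observations.
rng = np.random.default_rng(0)
alpha_true, lam_true, n = 2.0, 1.0, 100_000

# numpy parameterizes the gamma by shape alpha and scale 1/lambda
x = rng.gamma(shape=alpha_true, scale=1.0 / lam_true, size=n)

xbar = x.mean()
s2 = x.var()                # sigma-hat squared (second central sample moment)

lam_hat = xbar / s2         # lambda-hat = X-bar / sigma-hat^2
alpha_hat = xbar**2 / s2    # alpha-hat = X-bar^2 / sigma-hat^2

print(lam_hat, alpha_hat)   # both should be close to the true values
```

With a sample this large, both estimates land close to the true (λ, α); the point is that they are computed in one line each, with no iterative root-finding.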

#8.10.21 Suppose that X1, . . . , Xn are i.i.d. with density function

f(x|θ) = e^(−(x − θ)), x ≥ θ,

and f(x|θ) = 0 otherwise.
a. Find the method of moments estimate of θ.
b. Find the mle of θ. (Hint: Be careful, don't differentiate before thinking. For what values of θ is the likelihood positive?)
c. Find a sufficient statistic for θ.

Solution
a. Let µ = E(X1). Then

µ = ∫_θ^∞ x e^(−(x − θ)) dx
  = ∫_0^∞ (x + θ) e^(−x) dx
  = θ + 1.
Thus θ = µ − 1, and the method of moments estimate of θ is

θ̂ = µ̂₁ − 1 = X̄ − 1.

b. Note that the density function f(x|θ) is not differentiable at x = θ.
Also, f(x|θ) is strictly decreasing for x ≥ θ and equals 0 for x < θ.
The likelihood function is

lik(θ) = ∏_{i=1}^n e^(−(xi − θ))

if min(x1, . . . , xn) ≥ θ, and lik(θ) = 0 otherwise.
Since lik(θ) = e^(nθ) e^(−Σ xi) is increasing in θ wherever it is positive, the maximum of lik(θ) occurs at the endpoint θ = min(x1, . . . , xn), and we conclude that the mle of θ is

θ̃ = min(X1, . . . , Xn).
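A short simulation makes the comparison between the two estimators concrete, echoing the dispersion point from #8.10.12. This is an illustrative sketch, not part of the original solution; the true θ = 3, n = 20, and the number of replications are arbitrary choices.

```python
import numpy as np

# Compare the MoM estimate (X-bar - 1) and the mle (sample minimum) of theta
# for the shifted exponential f(x|theta) = exp(-(x - theta)), x >= theta.
# Assumed setup: true theta = 3, n = 20, 5000 replications.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 20, 5_000

# X = theta + E, where E is standard exponential
x = theta + rng.exponential(size=(reps, n))

mom = x.mean(axis=1) - 1.0      # theta-hat = X-bar - 1
mle = x.min(axis=1)             # theta-tilde = min(X_1, ..., X_n)

rmse_mom = np.sqrt(np.mean((mom - theta) ** 2))   # roughly 1/sqrt(n)
rmse_mle = np.sqrt(np.mean((mle - theta) ** 2))   # roughly sqrt(2)/n

print(rmse_mom, rmse_mle)
```

The mle's error shrinks at rate 1/n rather than the usual 1/√n, so its sampling distribution is far less dispersed than that of the method of moments estimate, even though the sample minimum carries a small upward bias of 1/n.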

c. Let I{xi ≥ θ} denote the indicator function of the event {xi ≥ θ}; i.e., I{xi ≥ θ} = 1 if xi ≥ θ and I{xi ≥ θ} = 0 if xi < θ.
Then the joint pdf of X is given by

f(x|θ) = ∏_{i=1}^n e^(−(xi − θ)) I{xi ≥ θ}
       = e^(−Σ_{i=1}^n xi) e^(nθ) I{min(x1, . . . , xn) ≥ θ}
       = g(t, θ) h(x),

where

t = min(x1, . . . , xn),
g(t, θ) = e^(nθ) I{min(x1, . . . , xn) ≥ θ},
h(x) = e^(−Σ_{i=1}^n xi).

It follows from the factorization theorem that T = min(X1, . . . , Xn) is sufficient for θ.

#8.10.31 George spins a coin three times and observes no heads. He then gives the coin to Hilary. She spins it until the first head occurs, and ends up spinning it four times total. Let θ denote the probability that the coin comes up heads.
a. What is the likelihood of θ?
b. What is the MLE of θ?

Solution
a. The likelihood function is

lik(θ) = P(heads)[P(tails)]^6 = θ(1 − θ)^6.

b. The loglikelihood is

l(θ) = log(θ) + 6 log(1 − θ).

Differentiating l(θ) with respect to θ and setting the derivative to zero, we get

(1)  l′(θ) = 1/θ − 6/(1 − θ) = 0.

The solution of (1) is θ = 1/7. Hence we conclude that the MLE of θ is

θ̂ = 1/7.
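The calculus above can be double-checked numerically by evaluating the likelihood on a fine grid and locating its peak. This is a quick sanity check, not part of the original solution; the grid resolution is an arbitrary choice.

```python
import numpy as np

# Numerical check that lik(theta) = theta * (1 - theta)**6 peaks at theta = 1/7.
# The grid endpoints avoid 0 and 1, where the likelihood vanishes.
theta = np.linspace(0.0001, 0.9999, 100_000)
lik = theta * (1.0 - theta) ** 6

theta_max = theta[np.argmax(lik)]
print(theta_max)  # close to 1/7 ≈ 0.142857
```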
