STAT 709 Midterm 2024


STAT 709: Midterm Exam

Instructor: Yiqiao Zhong, TA: Jingyang Lyu


Fall 2024

Date: Oct. 28, 2024, 11:00 am–12:15 pm

1. (40 points, 8 points for each subproblem) Suppose that X ∼ N (0, 1) is a standard normal random variable.
Denote the cumulative distribution function (CDF) of X by Φ(x). Let Y be a “censored” version
of X, defined by

          a,    if X ≥ a,
    Y =   X,    if |X| < a,        (1)
         −a,    if X ≤ −a,

where a > 0 is a given scalar. Let δx denote the delta measure defined as
              0,    if x ∉ A,
    δx (A) =                         for any Borel set A.
              1,    if x ∈ A,

(i) Find the CDF of Y using Φ.


(ii) Find the CDF associated with the measure δa.
(iii) Let µX and µY be the measures induced by X and Y, respectively. Express µY in terms of µX and delta
measures.
(iv) Does µY have a (Lebesgue) probability density function, or a probability mass function? Briefly explain
why.
(v) Let sign(x) be the sign function defined by

1,
 if x > 0
sign(x) = 0, if x = 0

−1, if x < 0

Show that |Y| and sign(Y) are independent.
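
For a quick numerical sanity check (not part of the exam), the sketch below simulates Y with numpy/scipy for an illustrative censoring level a = 1, compares the empirical P(Y = a) with 1 − Φ(a), and probes the independence of |Y| and sign(Y) by comparing a joint probability with the product of the corresponding marginals.

```python
# Monte Carlo sketch (illustrative, not a solution): simulate the censored variable Y.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
a = 1.0                        # illustrative choice of the censoring level
x = rng.standard_normal(1_000_000)
y = np.clip(x, -a, a)          # Y = a if X >= a, X if |X| < a, -a if X <= -a

# Point mass at a: should match 1 - Phi(a)
print("P(Y = a): empirical", np.mean(y == a), " vs 1 - Phi(a) =", 1 - norm.cdf(a))

# Independence probe: P(|Y| <= t, sign(Y) = 1) vs P(|Y| <= t) * P(sign(Y) = 1),
# using that sign(Y) = +/-1 almost surely and one illustrative threshold t < a.
t = 0.5
joint = np.mean((np.abs(y) <= t) & (np.sign(y) == 1))
prod = np.mean(np.abs(y) <= t) * np.mean(np.sign(y) == 1)
print("joint:", joint, " product of marginals:", prod)
```
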

2. (40 points, 8 points for each subproblem) Suppose that we have i.i.d. k-dimensional observations
X1, X2, . . . , Xn ∼ N(0, Σ), where Σ = diag{σ1², . . . , σk²} is an unknown diagonal matrix with σ1², . . . , σk² > 0.
We are interested in estimating Σ using the n observations.
(i) Do we have a parametric model or a nonparametric model?
(ii) Write down the likelihood function for estimating Σ.
(iii) Find the MLE Σ̂ of Σ.
(iv) What is the bias of the estimator Σ̂?
(v) Suppose that we are now interested in estimating Σ^{1/2} = diag{σ1, . . . , σk}. Denote Σ̂^{1/2} = diag{σ̂1, . . . , σ̂k}.
Here the square root means taking the square root of each diagonal element of the matrix. Use the Delta
method to derive

    √n ( (σ̂1, . . . , σ̂k) − (σ1, . . . , σk) )  →ᵈ  N(0, V)

and determine an expression for the matrix V ∈ R^{k×k}.
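
As an illustrative companion (a simulation sketch, not the official solution), the code below computes the MLE σ̂j² = (1/n) Σᵢ Xij² for each diagonal entry and checks the Delta-method limit numerically. Under the standard CLT plus Delta-method calculation with g(u) = √u, one would expect the asymptotic variance of √n(σ̂j − σj) to be σj²/2; the true values of k, n, and σj² used here are arbitrary illustrative choices.

```python
# Simulation sketch: MLE of a diagonal covariance and a Delta-method check.
import numpy as np

rng = np.random.default_rng(1)
k, n, reps = 3, 2000, 2000
sigma2 = np.array([0.5, 1.0, 2.0])     # illustrative true diagonal entries of Sigma

scaled_errors = []
for _ in range(reps):
    # Each column j is i.i.d. N(0, sigma_j^2), so rows are N(0, Sigma)
    X = rng.standard_normal((n, k)) * np.sqrt(sigma2)
    sigma2_hat = (X ** 2).mean(axis=0)                  # MLE (mean is known to be 0)
    scaled_errors.append(np.sqrt(n) * (np.sqrt(sigma2_hat) - np.sqrt(sigma2)))

emp_var = np.var(np.array(scaled_errors), axis=0)
print("empirical var of sqrt(n)(sigma_hat_j - sigma_j):", emp_var)
print("Delta-method prediction sigma_j^2 / 2:          ", sigma2 / 2)
```
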

3. (20 points, 10 points for each subproblem) Let G ∼ N (0, 1) be a standard normal variable, and
f : R → R be a continuously differentiable function.
(i) Prove Stein’s identity: E[f′(G) − G f(G)] = 0, where f and f′ are bounded.
(ii) Let (Xn)n≥1 be a sequence of random variables with supₙ E[Xn²] ≤ C < ∞. Stein’s continuity theorem
is a useful result, which states that the following two statements are equivalent.
(a) E[f′(Xn) − Xn f(Xn)] converges to zero whenever f : R → R is continuously differentiable with f, f′
bounded.
(b) Xn converges in distribution to G.

Please prove that (b) implies (a).
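
To see the identity in part (i) concretely, the sketch below (illustrative only) checks E[f′(G) − G f(G)] ≈ 0 by Monte Carlo for one bounded, continuously differentiable choice, f(x) = tanh(x), whose derivative 1 − tanh²(x) is also bounded.

```python
# Monte Carlo sketch: numerical check of Stein's identity for f(x) = tanh(x).
import numpy as np

rng = np.random.default_rng(2)
g = rng.standard_normal(2_000_000)   # draws of G ~ N(0, 1)

def f(x):
    return np.tanh(x)

def f_prime(x):
    return 1.0 - np.tanh(x) ** 2     # derivative of tanh, bounded by 1

# Should be close to 0 up to Monte Carlo error
print("E[f'(G) - G f(G)] ≈", np.mean(f_prime(g) - g * f(g)))
```
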
