Let $X_1,\ldots,X_n$ be a random sample from a uniform distribution on the interval (θ−1,θ+1), where −∞<θ<∞.
Find the method of moments estimator for θ.
Is your estimator in part 1 an unbiased estimator of θ? Justify your answer.
Given the following n=5 observations of X, give a point estimate of θ.
6.61, 7.70, 6.98, 8.36, 7.26
Solution.
The mean of the uniform distribution on (θ−1,θ+1) is
$$E[X_i] = \frac{(\theta-1)+(\theta+1)}{2} = \theta,$$
so the method of moments estimate for θ is obtained by setting the sample mean equal to the population mean, $\bar{x} = \theta$. Thus, the estimator is
$$\hat{\theta} = \bar{X}.$$
Yes. By linearity of expectation,
$$E[\hat{\theta}] = E\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = \theta.$$
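As a quick numerical sanity check (not part of the original argument), the following Python sketch simulates many samples of size n = 5 from a uniform distribution on (θ−1, θ+1) with an arbitrarily chosen θ = 7 and confirms that the average of the sample means is close to θ.

```python
import numpy as np

# Hypothetical sanity check of unbiasedness: average many realizations of X-bar.
rng = np.random.default_rng(0)
theta = 7.0           # arbitrary "true" value, chosen only for illustration
n, reps = 5, 100_000  # sample size and number of simulated samples

samples = rng.uniform(theta - 1, theta + 1, size=(reps, n))
estimates = samples.mean(axis=1)      # method-of-moments estimate for each sample

print(estimates.mean())               # should be very close to theta = 7.0
```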
We have
$$\hat{\theta} = \bar{x} = 7.382.$$
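For completeness, here is a short Python sketch that reproduces this point estimate directly from the five observations.

```python
# Point estimate of theta: the sample mean of the observed data.
data = [6.61, 7.70, 6.98, 8.36, 7.26]
theta_hat = sum(data) / len(data)
print(theta_hat)  # 7.382
```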
Problem 2
The heart rate (bpm) of each of 10 athletes is measured. The data are given below. Assuming that the data were drawn from a normal distribution, find a 95% confidence interval for the mean heart rate.
38, 54, 42, 36, 52, 44, 49, 50, 62, 50
Solution.
Assume that the data were drawn from $N(\mu, \sigma^2)$. Since the population standard deviation is unknown, we use a t-distribution. Our statistic is
$$T = \frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t(n-1).$$
Here, S is the sample standard deviation and t(n−1) is the t-distribution with n−1 degrees of freedom. The endpoints for a 100(1−α)% confidence interval are given by
$$\bar{x} \pm t_{\alpha/2}(n-1)\,\frac{s}{\sqrt{n}},$$
where $t_{\alpha/2}(n-1)$ is the 100(1−α/2)th percentile of a t(n−1) distribution. For our problem, the parameters are
$$\bar{x} = 47.7, \quad s \approx 7.8323, \quad n = 10, \quad \alpha = 0.05.$$
From the textbook's t-table,
$$t_{0.025}(9) = 2.262,$$
so the endpoints are
$$47.7 \pm 2.262 \cdot \frac{7.8323}{\sqrt{10}} \approx 47.7 \pm 5.6.$$
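As an illustrative check (assuming NumPy and SciPy are available), the same interval can be reproduced in a few lines of Python.

```python
import numpy as np
from scipy import stats

# 95% t-based confidence interval for the mean heart rate.
data = np.array([38, 54, 42, 36, 52, 44, 49, 50, 62, 50])
n = len(data)
xbar = data.mean()                     # 47.7
s = data.std(ddof=1)                   # ~7.8323 (sample standard deviation)
t_crit = stats.t.ppf(0.975, df=n - 1)  # ~2.262

half_width = t_crit * s / np.sqrt(n)
print(xbar - half_width, xbar + half_width)  # roughly (42.1, 53.3)
```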
Problem 3
Let $x_1,\ldots,x_n$ be an observed sample drawn independently from an exponential distribution, Exp(λ). Assume that the prior on λ is characterized by a Gamma distribution, Γ(α,β), with pdf
$$\pi(\lambda) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\lambda^{\alpha-1} e^{-\beta\lambda}, \quad 0 < \lambda < \infty.$$
Note that the pdf of an exponential distribution is given by
$$f(x) = \lambda e^{-\lambda x}, \quad 0 < x < \infty.$$
Find the posterior distribution for λ, π(λ∣x).
Calculate the mean of the posterior distribution.
Solution.
The starting point for these types of problems is
$$\text{posterior} \propto \text{likelihood} \times \text{prior}.$$
Here, ∝ means that the two sides differ only by a constant factor, where "constant" means anything that does not depend on λ. We lose no information about the distribution by dropping factors that do not involve λ, because the normalization constant can always be recovered by integrating with respect to λ. In fact, we do not even need the normalization constant to identify the distribution.
The likelihood is given by
$$f(x_1,\ldots,x_n \mid \lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^{n} e^{-\lambda \sum_{i=1}^{n} x_i} = \lambda^{n} e^{-\lambda n \bar{x}}.$$
The prior is
$$\pi(\lambda) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\lambda^{\alpha-1} e^{-\beta\lambda} \propto \lambda^{\alpha-1} e^{-\beta\lambda}.$$
Putting everything together,
$$\pi(\lambda \mid x) \propto \lambda^{n} e^{-\lambda n \bar{x}} \cdot \lambda^{\alpha-1} e^{-\beta\lambda} = \lambda^{\alpha+n-1} e^{-\lambda(\beta + n\bar{x})}.$$
By comparing to our known distributions, we see that the posterior distribution is
$$\Gamma(\alpha + n,\ \beta + n\bar{x}).$$
The mean of a Γ(α,β) distribution (with the parameterization used in this problem) is α/β, so the mean of the posterior distribution is
$$\frac{\alpha + n}{\beta + n\bar{x}}.$$
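To make the conjugate update concrete, here is a small Python sketch that applies the result to simulated exponential data; the prior hyperparameters (α = 2, β = 1) and the true rate λ = 0.5 are arbitrary values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustration values (not from the problem statement).
alpha, beta = 2.0, 1.0                           # Gamma(alpha, beta) prior on lambda
true_lam = 0.5
n = 50
x = rng.exponential(scale=1 / true_lam, size=n)  # Exp(lambda) data (scale = 1/lambda)

# Conjugate update: the posterior is Gamma(alpha + n, beta + n * xbar).
xbar = x.mean()
post_alpha = alpha + n
post_beta = beta + n * xbar

print("posterior mean:", post_alpha / post_beta)  # should be near true_lam
```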
Problem 4
Assume $y_1,\ldots,y_n$ are independent observations, where $y_i$ is drawn from $N(\beta x_i, \sigma^2)$. The $x_i$'s and $\sigma^2$ are known constants, and β is an unknown parameter which has prior distribution $N(\beta_0, \tau^2)$, where $\beta_0$ and $\tau^2$ are known constants. Derive the posterior distribution of β.
Solution.
We can use the same approach as in Problem 3. Here, constants are any factors that don't have a β in them. The likelihood is