Question
I need the answer quickly.
6.31 Boos and Hughes-Oliver (1998) detail a number of instances where application of
Basu's Theorem can simplify calculations. Here are a few.
(a) Let X1, ..., Xn be iid n(µ, σ²), where σ² is known.
(i) Show that X̄ is complete sufficient for µ, and S² is ancillary. Hence by Basu's
Theorem, X̄ and S² are independent.
(ii) Show that this independence carries over even if σ² is unknown, as knowledge
of σ² has no bearing on the distributions. (Compare this proof to the more
involved Theorem 5.3.1(a).)
(b) A Monte Carlo swindle is a technique for improving variance estimates. Suppose
that X1, ..., Xn are iid n(µ, σ²) and that we want to compute the variance of the
median, M.
(i) Apply Basu's Theorem to show that Var(M) = Var(M − X̄) + Var(X̄); thus we
only have to simulate the Var(M − X̄) piece of Var(M) (since Var(X̄) = σ²/n).
(ii) Show that the swindle estimate is more precise by showing that the variance of
M is approximately 2[Var(M)]²/(N − 1) and that of M − X̄ is approximately
2[Var(M − X̄)]²/(N − 1), where N is the number of Monte Carlo samples.
(c) (i) If X/Y and Y are independent random variables, show that
E[(X/Y)ᵏ] = E(Xᵏ)/E(Yᵏ).
(ii) Use this result and Basu's Theorem to show that if X1, ..., Xn are iid
gamma(α, β), where α is known, then for T = Σᵢ Xᵢ,
E( X(i) | T ) = T · E( X(i)/T ) = T · E(X(i))/E(T).
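As a quick numerical companion to part (a) (an illustration, not a proof), the sketch below simulates many normal samples and checks that the sample correlation between X̄ and S² over the replications is near zero, as Basu's Theorem predicts. The sample size n, the number of replications, and the values of µ and σ are arbitrary choices for the demonstration:

import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 100_000          # arbitrary sample size and replication count
mu, sigma = 2.0, 3.0           # arbitrary normal parameters

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)              # X-bar for each replication
s2 = samples.var(axis=1, ddof=1)         # S^2 for each replication

# If X-bar and S^2 are independent, their sample correlation across
# replications should be close to 0.
print("corr(X-bar, S^2) ≈", np.corrcoef(xbar, s2)[0, 1])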
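For part (b), here is a minimal sketch of the swindle under assumed, arbitrary settings (n observations per sample, N Monte Carlo samples, known σ², and a chosen number of repetitions). It estimates Var(M) both directly from the simulated medians and via Var(M − X̄) + σ²/n, then repeats both estimators to compare their spread, in the spirit of part (b)(ii):

import numpy as np

rng = np.random.default_rng(1)
n, N, reps = 15, 500, 200      # sample size, Monte Carlo samples, repetitions (all arbitrary)
mu, sigma = 0.0, 1.0           # sigma is treated as known

naive, swindle = [], []
for _ in range(reps):
    x = rng.normal(mu, sigma, size=(N, n))
    med = np.median(x, axis=1)
    xbar = x.mean(axis=1)
    naive.append(np.var(med, ddof=1))                          # direct estimate of Var(M)
    swindle.append(np.var(med - xbar, ddof=1) + sigma**2 / n)  # Var(M - X-bar) + sigma^2/n

# Both estimators target Var(M); the swindle version should vary less
# across repetitions, which is the precision gain described in (b)(ii).
print("mean estimate (naive, swindle):", np.mean(naive), np.mean(swindle))
print("spread of estimates (naive, swindle):", np.std(naive), np.std(swindle))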
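For part (c), a small simulation can illustrate the k = 1 case of (c)(i) combined with the Basu independence of X(i)/T and T in the gamma model, namely E(X(i)/T) = E(X(i))/E(T). The choice i = 1 (the sample minimum) and the values of α, β, and n below are arbitrary assumptions for the demonstration:

import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 200_000            # arbitrary sample size and replication count
alpha, beta = 3.0, 2.0          # arbitrary gamma shape (known) and scale

x = rng.gamma(shape=alpha, scale=beta, size=(reps, n))
t = x.sum(axis=1)               # T = sum of the X_i, complete sufficient for the scale
x_min = x.min(axis=1)           # X_(1), the sample minimum (i = 1 chosen for concreteness)

# X_(1)/T is ancillary for the scale and, by Basu's Theorem, independent of T,
# so E(X_(1)/T) should match E(X_(1))/E(T) (the k = 1 case of part (c)(i)).
print("E(X_(1)/T)    ≈", np.mean(x_min / t))
print("E(X_(1))/E(T) ≈", np.mean(x_min) / np.mean(t))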