Let $X_1, \ldots, X_n$ be a random sample of size $n > 1$ from the $N(\mu, \sigma^2)$ distribution, where $\mu \in (-\infty, +\infty)$ and $\sigma^2 > 0$ are two unknown parameters. Since the value of $\sigma^2$ depends on the value of $\mu$, we cannot estimate $\sigma^2$ without estimating $\mu$, and we cannot expect to estimate two unknown parameters unless we have at least two data points; hence the requirement that $n > 1$.

Given a data sample $(x_1, \ldots, x_n)$, the likelihood function of $(\mu, \sigma^2)$ is defined as

$$L(\mu, \sigma^2) = \left(2\pi\sigma^2\right)^{-n/2} \exp\left(-\frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2\right).$$

You may freely use the fact that the random variable

$$W = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \bar{X})^2$$

has the $\chi^2$ distribution with $(n-1)$ degrees of freedom.

(a) Without using any tools from multivariable calculus, show that $(\bar{x}, s^2)$, where

$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i \quad \text{and} \quad s^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2,$$

is the global maximizer of the likelihood function $L$.

(b) Calculate $E(S^2)$, where $S^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \bar{X})^2$ with $\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$.
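As an empirical companion to the statements above (my own sketch, not part of the original problem; all variable names are my choices), here is a minimal Monte Carlo check in Python that the quoted $\chi^2$ fact holds and that the estimator $S^2$ from part (b) has mean $\frac{n-1}{n}\sigma^2$ rather than $\sigma^2$:

```python
import numpy as np

# Monte Carlo sketch (my addition): simulate many N(mu, sigma2) samples of
# size n and check that
#   W = sum_i (X_i - Xbar)^2 / sigma2   has mean n-1 and variance 2(n-1),
# and that S^2 = (1/n) sum_i (X_i - Xbar)^2 has mean (n-1)/n * sigma2.
rng = np.random.default_rng(0)
mu, sigma2, n, reps = 2.0, 3.0, 10, 200_000  # arbitrary example values

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
xbar = samples.mean(axis=1, keepdims=True)
ss = ((samples - xbar) ** 2).sum(axis=1)  # sum of squared deviations
                                          # (the quantity called Q later on)
W = ss / sigma2
print("mean of W :", W.mean(), " vs n-1    =", n - 1)
print("var  of W :", W.var(),  " vs 2(n-1) =", 2 * (n - 1))

S2 = ss / n  # divide by n: the maximum likelihood estimator from part (a)
print("mean of S2:", S2.mean(), " vs (n-1)/n * sigma2 =", (n - 1) / n * sigma2)
```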

Before moving on to the remaining parts of the problem, let us set up some notation. Consider the class of estimators of $\sigma^2$ of the form $cQ$, where $c > 0$ is a constant and

$$Q = \sum_{i=1}^{n} (X_i - \bar{X})^2.$$

We know from part (a) that $c = 1/n$ leads to the maximum likelihood estimator $S^2$ of $\sigma^2$. We know from part (b) that $c = 1/(n-1)$ leads to an unbiased estimator

$$\hat{S}^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2$$

of $\sigma^2$, which is actually the UMVUE of $\sigma^2$.
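A remark that may help with the remaining parts (my addition, using only the $\chi^2$ fact quoted above): since $Q = \sigma^2 W$ with $W \sim \chi^2_{n-1}$, and a $\chi^2_k$ random variable has mean $k$ and variance $2k$,

$$E(Q) = (n-1)\sigma^2, \qquad \operatorname{Var}(Q) = 2(n-1)\sigma^4,$$

so every estimator of the form $cQ$ satisfies $E(cQ) = c(n-1)\sigma^2$ and $\operatorname{Var}(cQ) = 2c^2(n-1)\sigma^4$. These two moments are all that parts (c), (e), and (f) require.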
(c) Calculate the MSE of $\hat{S}^2$; remember that $\hat{S}^2$ is an unbiased estimator of $\sigma^2$, so the MSE of $\hat{S}^2$ equals the variance of $\hat{S}^2$.
(d) Recall that given any estimator $\hat{\theta}$ of a parameter $\theta$, we define $\operatorname{Bias}_\theta(\hat{\theta}) = E_\theta(\hat{\theta}) - \theta$. Show that

$$\operatorname{MSE}_\theta(\hat{\theta}) = \operatorname{Var}_\theta(\hat{\theta}) + \left[\operatorname{Bias}_\theta(\hat{\theta})\right]^2.$$
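For orientation (again my addition, a sketch of the standard argument rather than necessarily the intended solution): the identity follows by adding and subtracting $E_\theta(\hat{\theta})$ inside the square,

$$E_\theta\!\left[(\hat{\theta} - \theta)^2\right] = E_\theta\!\left[(\hat{\theta} - E_\theta\hat{\theta})^2\right] + \left[E_\theta\hat{\theta} - \theta\right]^2,$$

where the cross term drops out because $E_\theta[\hat{\theta} - E_\theta\hat{\theta}] = 0$ while $E_\theta\hat{\theta} - \theta$ is a constant.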
(e) Calculate the MSE of $S^2$ and show that the MSE of $S^2$ is smaller than the MSE of $\hat{S}^2$.
(f) Let $f(c)$ denote the MSE of the estimator $cQ$. Of course the value of $f(c)$ depends on $\sigma^2$, but in any given problem $\sigma^2$ is fixed (though unknown), so $f$ is a function defined on $(0, \infty)$. Find $c^* > 0$ such that $c^*$ is the global minimizer of $f$.
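If you want a numerical sanity check for part (f) (a hypothetical sketch, not the intended analytic solution; the function name `f` and the constants are my choices), the moments of $Q$ recorded above give $f(c) = 2c^2(n-1)\sigma^4 + \left(c(n-1)\sigma^2 - \sigma^2\right)^2$ in closed form, and a grid search locates its minimizer:

```python
import numpy as np

# Hypothetical sanity check for part (f): evaluate f(c) = MSE(cQ) on a grid,
# using E(Q) = (n-1) sigma2 and Var(Q) = 2 (n-1) sigma2^2, so that
#   f(c) = Var(cQ) + Bias(cQ)^2
#        = 2 c^2 (n-1) sigma2^2 + (c (n-1) sigma2 - sigma2)^2.
n, sigma2 = 10, 3.0  # arbitrary example values

def f(c):
    var = 2 * c**2 * (n - 1) * sigma2**2
    bias = c * (n - 1) * sigma2 - sigma2
    return var + bias**2

grid = np.linspace(1e-4, 0.5, 100_000)
c_star = grid[np.argmin(f(grid))]
print("numerical minimizer:", c_star)        # lands near 0.0909...
print("compare with 1/(n+1):", 1 / (n + 1))  # the value the search suggests
```

With these example values the grid search returns a minimizer indistinguishable from $1/(n+1)$, which suggests the target to aim for when minimizing $f$ analytically.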