
Question
Consider the linear regression model
$$y = X\beta + u$$
with iid random variables and assume $u \sim N(0, \sigma^2 I_n)$. Conditional on $X$, show that the MLE estimator of $\beta$ coincides with the OLS estimator of $\beta$. Compare the Gauss-Markov Theorem to the Cramér-Rao Theorem in this case.
Conditional on $X$, $y \sim N(X\beta, \sigma^2 I_n)$ and hence
$$f(y \mid X, \beta, \sigma^2) = \prod_{i=1}^{n} (2\pi\sigma^2)^{-1/2} \exp\!\left(-\frac{(y_i - x_i'\beta)^2}{2\sigma^2}\right) = (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{(y - X\beta)'(y - X\beta)}{2\sigma^2}\right). \tag{1}$$
From here obtain the log-likelihood as
$$\ell(y \mid X, \beta, \sigma^2) = -\frac{n}{2}\,\ln(2\pi\sigma^2) - \frac{(y - X\beta)'(y - X\beta)}{2\sigma^2}. \tag{2}$$
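As an aside (not part of the original derivation), the log-likelihood in (2) is straightforward to write down in code. The following is a minimal sketch assuming NumPy is available; the function and variable names (`gaussian_loglik`, `beta`, `sigma2`) are my own choices for illustration.

```python
# A minimal sketch of the log-likelihood in (2); not part of the original solution.
import numpy as np

def gaussian_loglik(beta, sigma2, y, X):
    """Log-likelihood of y | X under y ~ N(X beta, sigma2 * I_n), as in (2)."""
    n = y.shape[0]
    resid = y - X @ beta                                   # u = y - X*beta
    return -(n / 2) * np.log(2 * np.pi * sigma2) - (resid @ resid) / (2 * sigma2)
```

Only the quadratic form in the second term of (2) varies with $\beta$, which is the point exploited next.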
Since the only term in (2) that depends on $\beta$ is the numerator of the second RHS term, maximizing $\ell(y \mid X, \beta, \sigma^2)$ w.r.t. $\beta$ is equivalent to maximizing $-(y - X\beta)'(y - X\beta)$ w.r.t. $\beta$, in that both yield the same argmax. At the same time,
$$-(y - X\beta)'(y - X\beta) = -u'u = -\sum_{i=1}^{n} u_i^2, \tag{3}$$
which is the negative of the sum of squared residuals criterion function of OLS. Hence maximizing $\ell(y \mid X, \beta, \sigma^2)$ in (2) to obtain $\hat{\beta}_{ML}$ yields the same result as minimizing $\sum_{i=1}^{n} u_i^2$ in (3) to obtain $\hat{\beta}_{OLS}$. The Gauss-Markov and Cramér-Rao Theorems deliver matching conclusions in this case: Gauss-Markov states that OLS is the best linear unbiased estimator under homoskedasticity, while the Cramér-Rao Theorem gives the lower bound $\sigma^2(X'X)^{-1}$ on the variance of any unbiased estimator, which the MLE attains under normality. Since $\hat{\beta}_{ML} = \hat{\beta}_{OLS}$, the OLS estimator attains the Cramér-Rao bound and is therefore efficient among all unbiased estimators, not just linear ones.
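To make the equivalence concrete, here is a hedged numerical check (my own illustration, not part of the original answer), assuming NumPy and SciPy and using simulated data of my own design: it maximizes the log-likelihood in (2) over $\beta$ numerically, compares the result with the closed-form OLS estimator $(X'X)^{-1}X'y$, and computes the Cramér-Rao bound $\sigma^2(X'X)^{-1}$ from the Fisher information, which is exactly the conditional variance of the OLS estimator.

```python
# A sketch on simulated data; all names and the simulation design are my own.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, k, sigma2 = 200, 3, 2.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])   # include a constant
beta_true = np.array([1.0, -0.5, 2.0])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)    # y = X*beta + u

def neg_loglik(beta):
    # Negative of (2) with sigma2 treated as known; the argmin over beta
    # does not depend on sigma2.
    resid = y - X @ beta
    return (n / 2) * np.log(2 * np.pi * sigma2) + (resid @ resid) / (2 * sigma2)

beta_mle = minimize(neg_loglik, x0=np.zeros(k)).x        # numerical MLE of beta
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)             # closed-form OLS
print(np.allclose(beta_mle, beta_ols, atol=1e-4))        # True: the estimates coincide

# Cramer-Rao bound for beta: inverse Fisher information I(beta) = X'X / sigma2,
# which equals the OLS conditional variance sigma2 * (X'X)^{-1}.
fisher = X.T @ X / sigma2
crb = np.linalg.inv(fisher)
var_ols = sigma2 * np.linalg.inv(X.T @ X)
print(np.allclose(crb, var_ols))                         # True: OLS attains the bound
```

Note that $\sigma^2$ enters `neg_loglik` only as a positive scale and an additive constant, so the maximizer over $\beta$ is the same whether $\sigma^2$ is known or estimated jointly, consistent with the argument following (2).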