Question
1. Consider a random sample $\{x_i, y_i\}_{i=1}^{n}$ from the data generating process $y_i = x_i \beta + e_i$, where $E(e_i \mid x_i) = 0$. With this information, answer the following (you may use matrix notation if desired):
(a) Defining the sum of squared errors as $\mathrm{SSE}(\beta) = \sum_{i=1}^{n} (y_i - x_i \beta)^2$, take the first-order condition to solve for the estimator of $\beta$.
(b) Exploiting the objective function from part (a), show that this estimator corresponds to a minimum (i.e., check the second derivative).
(c) Show that the estimator from part (a) is an unbiased estimator of $\beta$.
(d) Supposing the error is heteroskedastic, derive the conditional variance of the estimator from part (a).
(e) Supposing the error is homoskedastic, simplify your answer from part (d).
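A minimal worked sketch under the standard assumptions (scalar regressor $x_i$, i.i.d. sample of size $n$, and $\sum_{i=1}^{n} x_i^2 > 0$):

\begin{align*}
\text{(a)}\quad & \frac{\partial\,\mathrm{SSE}(\beta)}{\partial \beta}
  = -2\sum_{i=1}^{n} x_i (y_i - x_i \beta) = 0
  \;\Longrightarrow\;
  \hat{\beta} = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}. \\[4pt]
\text{(b)}\quad & \frac{\partial^2\,\mathrm{SSE}(\beta)}{\partial \beta^2}
  = 2\sum_{i=1}^{n} x_i^2 > 0,
  \quad\text{so the first-order condition locates a minimum.} \\[4pt]
\text{(c)}\quad & \hat{\beta}
  = \beta + \frac{\sum_i x_i e_i}{\sum_i x_i^2},
  \qquad
  E[\hat{\beta} \mid x_1,\dots,x_n]
  = \beta + \frac{\sum_i x_i\, E[e_i \mid x_i]}{\sum_i x_i^2}
  = \beta, \\
  & \text{using random sampling and } E(e_i \mid x_i) = 0. \\[4pt]
\text{(d)}\quad & \mathrm{Var}(\hat{\beta} \mid x_1,\dots,x_n)
  = \frac{\sum_i x_i^2\, \sigma_i^2}{\bigl(\sum_i x_i^2\bigr)^2},
  \qquad \sigma_i^2 \equiv \mathrm{Var}(e_i \mid x_i), \\
  & \text{the cross terms vanishing because the draws are independent.} \\[4pt]
\text{(e)}\quad & \sigma_i^2 = \sigma^2 \text{ for all } i
  \;\Longrightarrow\;
  \mathrm{Var}(\hat{\beta} \mid x_1,\dots,x_n) = \frac{\sigma^2}{\sum_i x_i^2}.
\end{align*}

In matrix notation the same steps give $\hat{\beta} = (X'X)^{-1}X'y$, with conditional variance $(X'X)^{-1} X' \Omega X (X'X)^{-1}$ where $\Omega = \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2)$ under heteroskedasticity, simplifying to $\sigma^2 (X'X)^{-1}$ under homoskedasticity.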