1 The two-variable regression. For the regression model y = α + βx + ε:
(a) Show that the least squares normal equations imply Σ_i e_i = 0 and Σ_i x_i e_i = 0.
(b) Show that the solution for the constant term is a = ȳ − b x̄.
(c) Show that the solution for b is b = [Σ_i (x_i − x̄)(y_i − ȳ)] / [Σ_i (x_i − x̄)²].
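These three results can be checked numerically before attempting the algebra. The sketch below uses made-up data (the variable names and the simulated model are illustrative, not part of the exercise) and confirms that the closed-form solutions from parts (b) and (c) produce residuals satisfying the two conditions in part (a):

```python
import numpy as np

# Illustrative data for a two-variable regression (assumed, not from the problem).
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 0.5 * x + rng.normal(size=50)

# Closed-form OLS slope and intercept from parts (c) and (b).
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
e = y - a - b * x  # least squares residuals

# Part (a): the normal equations force both sums to zero (up to rounding).
print(np.isclose(e.sum(), 0.0), np.isclose((x * e).sum(), 0.0))
```

Both printed values should be True: the residuals sum to zero and are orthogonal to the regressor, which is exactly what the normal equations assert.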
2 Change in the sum of squares. Suppose that b is the least squares coefficient vector in the regression of y on X and that c is any other K × 1 vector. Prove that the difference in the two sums of squared residuals is
(y − Xc)′(y − Xc) − (y − Xb)′(y − Xb) = (c − b)′X′X(c − b).
Prove that this difference is positive (for any c ≠ b, assuming X has full column rank).
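A quick numeric sketch makes the identity concrete before proving it. The data below are made up for illustration; any perturbation of b serves as the "other" vector c:

```python
import numpy as np

# Illustrative data (assumed): n observations, K regressors.
rng = np.random.default_rng(1)
n, K = 30, 3
X = rng.normal(size=(n, K))
y = rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)  # least squares coefficient vector
c = b + rng.normal(size=K)             # any other K x 1 vector

# Difference in sums of squared residuals vs. the quadratic form.
lhs = (y - X @ c) @ (y - X @ c) - (y - X @ b) @ (y - X @ b)
rhs = (c - b) @ (X.T @ X) @ (c - b)
print(np.isclose(lhs, rhs), lhs > 0)
```

The two sides agree, and the difference is positive, consistent with X′X being positive definite when X has full column rank.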
3 (Difficult) Adding an observation. A data set consists of n observations on X_n and y_n. The least squares estimator based on these n observations is b_n = (X_n′X_n)⁻¹X_n′y_n. Another observation, x_s and y_s, becomes available. Prove that the least squares estimator computed using this additional observation is
b_{n,s} = b_n + [1 / (1 + x_s′(X_n′X_n)⁻¹x_s)] (X_n′X_n)⁻¹x_s (y_s − x_s′b_n).
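The updating formula can be verified numerically by comparing it against OLS recomputed on the augmented sample. The data and true coefficients below are assumed purely for illustration:

```python
import numpy as np

# Illustrative data (assumed): n observations plus one new one (x_s, y_s).
rng = np.random.default_rng(2)
n, K = 40, 3
beta = np.array([1.0, -0.5, 2.0])
Xn = rng.normal(size=(n, K))
yn = Xn @ beta + rng.normal(size=n)
xs = rng.normal(size=K)
ys = xs @ beta + rng.normal()

XtX_inv = np.linalg.inv(Xn.T @ Xn)
bn = XtX_inv @ Xn.T @ yn  # estimator from the first n observations

# Updating formula from the problem statement.
bns = bn + (XtX_inv @ xs) * (ys - xs @ bn) / (1.0 + xs @ XtX_inv @ xs)

# Full OLS on all n + 1 observations for comparison.
X_full = np.vstack([Xn, xs])
y_full = np.append(yn, ys)
b_full = np.linalg.solve(X_full.T @ X_full, X_full.T @ y_full)
print(np.allclose(bns, b_full))
```

The agreement reflects the Sherman–Morrison rank-one update of (X_n′X_n)⁻¹, which is the algebraic core of the proof.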
4 Prove the theorem on the change in the sum of squares when a variable is added to a regression.
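One common statement of this theorem (e.g., in Greene's Econometric Analysis) is that if a variable z is added to a regression of y on X, the new sum of squared residuals is u′u = e′e − c²(z*′z*), where e are the old residuals, c is the coefficient on z in the long regression, and z* is z with X partialled out. A numeric sketch on made-up data (all names and values here are illustrative assumptions):

```python
import numpy as np

# Illustrative data (assumed): regress y on X, then add the variable z.
rng = np.random.default_rng(3)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
z = rng.normal(size=n)
y = rng.normal(size=n)

def resid(M, v):
    """Residuals from regressing v on the columns of M."""
    return v - M @ np.linalg.lstsq(M, v, rcond=None)[0]

e = resid(X, y)                         # short-regression residuals
u = resid(np.column_stack([X, z]), y)   # long-regression residuals
z_star = resid(X, z)                    # z with X partialled out
c = (z_star @ y) / (z_star @ z_star)    # coefficient on z (Frisch-Waugh)

# The theorem: new SSR equals old SSR minus c^2 * (z*'z*).
print(np.isclose(u @ u, e @ e - c**2 * (z_star @ z_star)))
```

In particular, since c²(z*′z*) ≥ 0, adding a variable can never increase the sum of squared residuals.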