Question
When is the null hypothesis not testable, and how does one construct the test and the estimators when V is known and when it is not known?
Consider the multiple regression model for the n y-data y1, ..., yn (n is the sample size)

y = X1β1 + X2β2 + ε,

where y = (y1, ..., yn)', and X1 and X2 are random except for the intercept term (i.e., the vector of 1s) included in X1. Conditional on X1 and X2, the random error vector ε is jointly normal with zero expectation and variance-covariance matrix V, which does not depend on X1 and X2. V is not a diagonal matrix (i.e., some off-diagonal elements are nonzero). β1 and β2 are vectors of two different sets of regression coefficients; β1 has two regression coefficients and β2 has four regression coefficients. β = (β1', β2')'; that is, β is a column vector of six regression coefficients.

a) V is completely known (i.e., the values of all elements of V are given). Let W be a matrix of k rows (k > 1) and four columns of given real numbers. Of interest are the hypotheses

H0: Wβ2 = 0 versus H1: Wβ2 ≠ 0.

i) Is this null hypothesis always testable? Why or why not? [5 points]
ii) Consider the case that this null hypothesis is testable. Construct a statistical test and its rejection region for H0. [10 points]
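For part a) ii), the sketch below shows one standard construction (not necessarily the intended solution). It assumes X = (X1, X2) has full column rank and that W has full row rank k, assumptions tied to the testability question in part i). With V known, the generalized least squares (GLS) estimator is exactly normal given X1 and X2, and a Wald-type quadratic form in Wβ̂2 is chi-square under H0.

```latex
\[
  \hat{\beta} = (X'V^{-1}X)^{-1}X'V^{-1}y, \qquad
  \operatorname{Var}(\hat{\beta}\mid X) = (X'V^{-1}X)^{-1} =: C, \qquad
  X = (X_1, X_2).
\]
\[
  T = (W\hat{\beta}_2)'\,(W C_{22} W')^{-1}(W\hat{\beta}_2) \;\sim\; \chi^2_{k}
  \quad \text{under } H_0 .
\]
```

Here C22 denotes the 4 × 4 block of C corresponding to β2, and H0 is rejected at level α when T exceeds the (1 − α) quantile χ²_{k, 1−α}. If W lacked full row rank, W C22 W' would be singular and T would not be defined as written.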
b) V is completely known (i.e., the values of all elements of V are given). Construct an estimator of β and discuss the statistical properties (e.g., bias, variance) of your estimator. [10 points]

c) Consider the case that the values of V are not completely given. Construct an estimator of β and derive its variance-covariance matrix. Can the variance-covariance matrix be unbiasedly estimated? [10 points]
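For parts a) ii) and b), the following is a minimal numerical sketch of the GLS estimator with known V and the Wald-type test above, written with NumPy/SciPy; the function name and the argument names y, X1, X2, V, W are placeholders, not part of the original problem.

```python
import numpy as np
from scipy import stats

def gls_wald_test(y, X1, X2, V, W, alpha=0.05):
    """GLS estimate of beta = (beta1', beta2')' when V is fully known, plus a
    Wald-type test of H0: W beta2 = 0.  Assumes X = [X1, X2] has full column
    rank and W has full row rank k; otherwise the hypothesis is not testable
    as written.  All names here are placeholders, not from the problem."""
    X = np.hstack([X1, X2])
    V_inv = np.linalg.inv(V)
    C = np.linalg.inv(X.T @ V_inv @ X)       # Var(beta_hat | X) = (X'V^-1 X)^-1
    beta_hat = C @ X.T @ V_inv @ y           # GLS estimator (unbiased given V)

    p1 = X1.shape[1]                         # beta1 occupies the first p1 slots
    beta2_hat = beta_hat[p1:]
    C22 = C[p1:, p1:]                        # covariance block of beta2_hat

    r = W @ beta2_hat                        # estimate of W beta2
    T = (r.T @ np.linalg.solve(W @ C22 @ W.T, r)).item()  # Wald statistic
    k = W.shape[0]
    crit = stats.chi2.ppf(1 - alpha, df=k)   # chi-square critical value, k df
    return beta_hat, T, crit, bool(T > crit) # reject H0 when T > crit
```

For part c), one route often taken (again only a sketch of an approach, not necessarily the intended answer) is feasible GLS: estimate V under some assumed structure, plug the estimate into the same formulas, and examine how the properties of the estimator and of its estimated variance-covariance matrix depend on how V was estimated.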