Question
Recall the covariance of two RV's X and Y is defined by

\[
\operatorname{Cov}(X, Y) := E\big[(X - EX)(Y - EY)\big] = E[XY] - (EX)(EY).
\]

It has the following properties:

• Cov(X, X) = Var(X).
• Symmetry: Cov(X, Y) = Cov(Y, X).
• Linearity in the "first slot": for any real numbers a, b,
  Cov(aX + b, Y) = a Cov(X, Y),
  and for any RV's X₁, X₂,
  Cov(X₁ + X₂, Y) = Cov(X₁, Y) + Cov(X₂, Y).
• Linearity in the "second slot": for any real numbers a, b,
  Cov(X, aY + b) = a Cov(X, Y),
  and for any RV's Y₁, Y₂,
  Cov(X, Y₁ + Y₂) = Cov(X, Y₁) + Cov(X, Y₂).
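As a quick illustration of how these properties combine, here is the two-variable case written out (only Var X = Cov(X, X), linearity in each slot, and symmetry are used; the exercise below asks for the general n-variable version):

```latex
\begin{aligned}
\operatorname{Var}(X_1 + X_2)
  &= \operatorname{Cov}(X_1 + X_2,\; X_1 + X_2)
      && \text{since } \operatorname{Var} X = \operatorname{Cov}(X, X) \\
  &= \operatorname{Cov}(X_1, X_1) + \operatorname{Cov}(X_1, X_2)
   + \operatorname{Cov}(X_2, X_1) + \operatorname{Cov}(X_2, X_2)
      && \text{linearity in each slot} \\
  &= \operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 2\operatorname{Cov}(X_1, X_2)
      && \text{symmetry.}
\end{aligned}
```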

(i) Show that for any RV's X₁, X₂, …, Xₙ,

\[
\operatorname{Var}(X_1 + X_2 + \cdots + X_n) = \sum_{i=1}^{n} \sum_{j=1}^{n} \operatorname{Cov}(X_i, X_j).
\]

(Hint: use Var X = Cov(X, X) and expand using linearity in both "slots".)

(ii) Show that if X₁, X₂, …, Xₙ are independent,

\[
\operatorname{Var}(X_1 + X_2 + \cdots + X_n) = \operatorname{Var}(X_1) + \operatorname{Var}(X_2) + \cdots + \operatorname{Var}(X_n).
\]

(Hint: recall that Cov(X, Y) = 0 if X, Y are independent.)

(iii) If X₁, X₂, …, Xₙ are also identically distributed as X, show that

\[
\operatorname{Var} \bar{X} = \frac{1}{n} \operatorname{Var} X,
\]

where X̄ is the sample mean as a random variable

\[
\bar{X} = \frac{1}{n}\,(X_1 + X_2 + \cdots + X_n).
\]
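For readers who want to sanity-check part (iii) numerically before proving it, here is a minimal NumPy sketch (an illustrative simulation, not a proof; the sample size n, the number of trials, and the Exponential(1) population, which has Var X = 1, are arbitrary choices for this check):

```python
import numpy as np

# Monte Carlo check of part (iii): for i.i.d. X_1, ..., X_n, the variance of
# the sample mean X-bar should be Var(X) / n.
rng = np.random.default_rng(0)

n = 10            # number of RV's averaged in each sample mean
trials = 200_000  # number of independent sample means to simulate

# Exponential(1) population, chosen so that Var(X) = 1.
samples = rng.exponential(scale=1.0, size=(trials, n))
sample_means = samples.mean(axis=1)

print("empirical Var of sample mean:", sample_means.var())
print("Var(X) / n                  :", 1.0 / n)
```

With these settings the empirical variance comes out close to 0.1, matching Var(X)/n = 1/10.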