
Question
We have seen in the course that if X is independent of F, then E[X | F] = E[X]. Is the reverse implication also correct? Namely, if we know that E[X | F] = E[X], can we deduce that X is independent of F? This problem shows that this is NOT the case.

Let X ~ N(0, 1) be a standard normal random variable, and let Y be a random variable such that

P(Y = 1) = P(Y = −1) = 1/2.

Assume that X and Y are independent and let Z = XY. Let F = σ(X). Notice that X is always σ(X)-measurable and that Y is independent of F, since it is independent of X.

(a) Find E[Z] and Var(Z).

(b) Using properties of the conditional expectation, find E[Z | F]. Deduce that E[Z | F] = E[Z].

It is possible to prove that Z ~ N(0, 1) (see below).
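As a sanity check (not part of the original exercise), the claims can be verified numerically: a short Monte Carlo sketch in Python, with the sample size n chosen arbitrarily, showing that E[Z] ≈ 0 and Var(Z) ≈ 1, and that Z is nevertheless not independent of F = σ(X), since |Z| = |X| exactly.

```python
import numpy as np

# Simulate Z = X*Y with X ~ N(0,1) and Y = +/-1 with probability 1/2 each.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.choice([-1.0, 1.0], size=n)   # P(Y = 1) = P(Y = -1) = 1/2
z = x * y

# (a) E[Z] = E[X]E[Y] = 0 and Var(Z) = E[X^2 Y^2] = E[X^2] = 1.
print("E[Z]   ~", z.mean())
print("Var(Z) ~", z.var())

# (b) E[Z | F] = E[XY | sigma(X)] = X * E[Y] = 0 = E[Z]. Yet Z is NOT
# independent of F: multiplying by Y = +/-1 only flips the sign of X,
# so |Z| = |X| holds pointwise.
print("max | |Z| - |X| | =", np.max(np.abs(np.abs(z) - np.abs(x))))
```

The last line prints 0.0, exhibiting a deterministic relation between Z and the σ(X)-measurable variable |X|, which rules out independence even though the conditional expectation matches the unconditional one.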