
Question
We have seen in the course that if X is independent of F, then E[X | F] = E[X]. Is the reverse implication also correct? Namely, if we know that E[X | F] = E[X], can we deduce that X is independent of F? This problem shows that this is NOT the case.

Let X ~ N(0, 1) be a standard normal random variable. Let Y be a random variable such that

P(Y = 1) = P(Y = −1) = 1/2.

Assume that X and Y are independent, and let Z = XY. Let F = σ(X). Notice that X is always σ(X)-measurable and that Y is independent of F, since it is independent of X.

(a) Find E[Z] and Var(Z).

(b) Using properties of the conditional expectation, find E[Z | F]. Deduce that E[Z | F] = E[Z].

It is possible to prove that Z ~ N(0, 1).
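The setup above can be checked numerically. The following is a minimal Monte Carlo sketch (not part of the exercise) that estimates E[Z] and Var(Z), and illustrates why Z cannot be independent of F = σ(X): since Z = XY with Y ∈ {−1, 1}, we have |Z| = |X| exactly, so Z is a deterministic function of X up to sign.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X ~ N(0, 1), Y uniform on {-1, +1}, independent of X
X = rng.standard_normal(n)
Y = rng.choice([-1.0, 1.0], size=n)
Z = X * Y

# (a) By independence, E[Z] = E[X]E[Y] = 0 and
#     Var(Z) = E[X^2 Y^2] = E[X^2]E[Y^2] = 1.
print(Z.mean(), Z.var())  # both should be close to 0 and 1

# Z is NOT independent of sigma(X): |Z| = |X| holds pointwise,
# even though E[Z | sigma(X)] = X * E[Y] = 0 = E[Z].
print(np.allclose(np.abs(Z), np.abs(X)))  # True by construction
```

The last check is the heart of the counterexample: knowing X pins down Z up to its sign, so Z is far from independent of σ(X), yet the conditional expectation E[Z | σ(X)] = X·E[Y] = 0 agrees with E[Z].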