4. Let X, Y be non-negative continuous random variables with probability density functions (pdfs) g(x) and h(y), respectively. Further, let f(x, y) denote their joint pdf. We say that X and Y are independent if f(x, y) = g(x)h(y) for all x, y ≥ 0. Further, we define the expectation of X to be E[X] = ∫₀^∞ x g(x) dx, with a similar definition for Y but g replaced by h and x replaced by y. We also define E[XY] = ∬_{(0,∞)×(0,∞)} xy f(x, y) dx dy to be the expectation of XY. Use Fubini's theorem (which you may assume holds) to show that if X and Y are independent, then E[XY] = E[X]E[Y].
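One way the argument can go, writing g and h for the marginal densities of X and Y as in the problem statement (a sketch, not necessarily the only valid presentation):

```latex
\begin{align*}
E[XY] &= \iint_{(0,\infty)\times(0,\infty)} xy\, f(x,y)\,dx\,dy
       = \iint_{(0,\infty)\times(0,\infty)} xy\, g(x)\,h(y)\,dx\,dy
       && \text{(independence: } f(x,y)=g(x)h(y)\text{)} \\
      &= \int_0^\infty \!\! \int_0^\infty x\,g(x)\; y\,h(y)\,dx\,dy
       && \text{(Fubini: replace the double integral by an iterated one)} \\
      &= \int_0^\infty y\,h(y) \left( \int_0^\infty x\,g(x)\,dx \right) dy
       && \text{(} y\,h(y) \text{ is constant in } x\text{)} \\
      &= E[X] \int_0^\infty y\,h(y)\,dy
       = E[X]\,E[Y].
\end{align*}
```

Fubini's theorem applies here because the integrand xy f(x, y) is non-negative (in the non-negative case the Tonelli form of the theorem needs no further hypotheses).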
A First Course in Probability (10th Edition)
ISBN: 9780134753119
Author: Sheldon Ross
Publisher: Pearson
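As a numerical sanity check of the identity (not part of the exercise itself), the hypothetical snippet below estimates E[X], E[Y], and E[XY] by Monte Carlo for two independent exponential variables, where the product rule is known to hold:

```python
import random

# X ~ Exp(rate=1), so E[X] = 1; Y ~ Exp(rate=2), so E[Y] = 1/2.
# Drawing the samples independently makes the joint density factor,
# so E[XY] should come out close to E[X] * E[Y] = 0.5.
random.seed(0)
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(2.0) for _ in range(n)]

e_x = sum(xs) / n
e_y = sum(ys) / n
e_xy = sum(x * y for x, y in zip(xs, ys)) / n

print(e_x, e_y, e_xy)  # e_xy should be close to e_x * e_y
```

The check only illustrates the theorem for one pair of distributions; the Fubini argument is what establishes it for all independent non-negative X and Y.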