1. Let X, Y be non-negative continuous random variables with probability density functions (pdf) $g_X(x)$ and $h_Y(y)$, respectively. Further, let $f(x, y)$ denote their joint pdf. We say that X and Y are independent if $f(x, y) = g_X(x)\,h_Y(y)$ for all $x, y \ge 0$. Further, we define the expectation of X to be $E[X] = \int_0^\infty x\, g_X(x)\,dx$, with a similar definition for Y but $g_X$ replaced by $h_Y$ and $x$ replaced by $y$. We also define $E[XY] = \iint_{(0,\infty)\times(0,\infty)} xy\, f(x, y)\,dx\,dy$ to be the expectation of XY. Use Fubini's theorem (which you may assume holds) to show that if X and Y are independent, then $E[XY] = E[X]\,E[Y]$. [2]
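Since the question asks for a Fubini argument, here is a minimal sketch of the kind of derivation intended, written in LaTeX and using the question's notation $g_X$, $h_Y$ for the marginal pdfs; the exact ordering of steps is an assumption, not the graded solution.

```latex
\begin{align*}
E[XY] &= \iint_{(0,\infty)\times(0,\infty)} xy\, f(x,y)\,dx\,dy \\
      &= \iint_{(0,\infty)\times(0,\infty)} xy\, g_X(x)\, h_Y(y)\,dx\,dy
         && \text{(independence: } f = g_X h_Y\text{)} \\
      &= \int_0^\infty \!\left( \int_0^\infty x\, g_X(x)\, y\, h_Y(y)\,dx \right) dy
         && \text{(Fubini: write as an iterated integral)} \\
      &= \int_0^\infty y\, h_Y(y) \left( \int_0^\infty x\, g_X(x)\,dx \right) dy
         && \text{(} y\, h_Y(y)\text{ is constant in } x\text{)} \\
      &= \left( \int_0^\infty x\, g_X(x)\,dx \right)\!
         \left( \int_0^\infty y\, h_Y(y)\,dy \right)
       = E[X]\,E[Y].
\end{align*}
```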