A First Course in Probability (10th Edition)
ISBN: 9780134753119
Author: Sheldon Ross
Publisher: PEARSON
1. Expectation and Variance
The notions of expected value and variance are central in the study of probability. Expected value, roughly, gives a sense of the "center" of a probability distribution. For a discrete random variable X (that is, a random variable which takes on at most countably many values), the expected value is defined as E[X] = Σ_{x∈Ω} x P(X = x), where Ω ⊂ ℝ is some at most countable set. For a continuous random variable Y with density f(x), the expectation is given by E[Y] = ∫ x f(x) dx.
The variance of a random variable X (either continuous or discrete) is given by Var[X] = E[(X − E[X])²]. Roughly, the variance tells us how spread out a distribution is with respect to its center. Please try to answer the following questions about expected value and variance.
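As a concrete illustration of these two definitions (not part of the original problem), here is a small sketch computing E[X] and Var[X] for a fair six-sided die directly from the sums above:

```python
# Expected value and variance of a fair six-sided die,
# computed straight from the definitions.
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # uniform probability for each face

ex = sum(x * p for x in values)                # E[X] = sum of x * P(X = x)
var = sum((x - ex) ** 2 * p for x in values)   # Var[X] = E[(X - E[X])^2]

print(ex)   # 3.5
print(var)  # 35/12 ≈ 2.9167
```

Note that the center E[X] = 3.5 is not a value the die can actually take; it is a weighted average of the outcomes.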
(a) Let X be a random variable. Show that the variance of X can be expressed as Var[X] = E[X²] − E[X]².
(b) Suppose X and Y are independent random variables, i.e., for any events A, B ⊆ ℝ, we have
P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B).
(In terms of densities, if f(x, y) is the joint density of X and Y, independence implies f(x, y) = f_X(x) f_Y(y), where f_X and f_Y are the marginal densities of X and Y.) Show that E[XY] = E[X]E[Y].
(c) Suppose X, Y are independent random variables. Show that Var[aX + bY] = a²Var[X] + b²Var[Y]. (The previous part should be useful.)
(d) Let X be a random variable. Which constant β ∈ ℝ minimizes E[(X − β)²]? What does this say about E[X] as a measure of the center of the distribution of X?
(e) Lastly, suppose X₁, ..., Xₙ are independent random variables with mean μ and variance σ². Define the sample average to be X̄ = (1/n)(X₁ + ⋯ + Xₙ). What is E[X̄]? What about Var[X̄]?
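The identities in (a), (b), and (e) can be sanity-checked numerically before proving them. The sketch below (an illustration, not a proof; the uniform distributions and sample sizes are arbitrary choices) simulates independent draws and compares both sides of each identity:

```python
import random

random.seed(0)
n = 200_000

# X and Y: independent draws, each uniform on [0, 1]
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

ex, ey = mean(xs), mean(ys)

# (a) Var[X] = E[X^2] - E[X]^2: both formulas on the same sample
var_def = mean([(x - ex) ** 2 for x in xs])     # E[(X - E[X])^2]
var_alt = mean([x * x for x in xs]) - ex ** 2   # E[X^2] - E[X]^2

# (b) independence: E[XY] should be close to E[X] E[Y]
exy = mean([x * y for x, y in zip(xs, ys)])

# (e) the sample average of m draws should have variance ~ sigma^2 / m
m = 100
group_means = [mean(xs[i:i + m]) for i in range(0, n, m)]
grand = mean(group_means)
var_mean = mean([(g - grand) ** 2 for g in group_means])
# Uniform[0,1] has sigma^2 = 1/12, so expect var_mean ≈ (1/12)/100
```

Within the sample, `var_def` and `var_alt` agree up to floating-point rounding (they are algebraically identical), while the checks in (b) and (e) match only up to Monte Carlo error of order 1/√n.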