Question
1. Expectation and Variance
The notions of expected value and variance are central in the study of probability. The expected value, roughly,
gives a sense of the "center" of a probability distribution. For a discrete random variable X (that is, a random
variable which takes on at most countably many values), the expected value is defined as E[X] = Σ_{x∈Ω} x·P(X = x),
where Ω ⊆ ℝ is some at most countable set. For a continuous random variable Y with density f(x), the
expectation is given by E[Y] = ∫ x·f(x) dx.
The variance of a random variable X (either continuous or discrete) is given by Var[X] = E[(X − E[X])²].
Roughly, the variance tells us how spread out a distribution is with respect to its center. Please try to answer
the following questions about expected value and variance; short illustrative sketches for the individual parts follow the problem statement.
(a) Let X be a random variable. Show that the variance of X can be expressed as Var[X] = E[X²] - E[X]².
(b) Suppose X and Y are independent random variables, i.e., for any events A, B ⊆ ℝ, we have
P(X ∈ A, Y ∈ B) = P(X ∈ A)·P(Y ∈ B).
(In terms of densities, if f(x, y) is the joint density of X and Y, independence implies f(x, y) = f_X(x)·f_Y(y),
where f_X and f_Y are the marginal densities of X and Y.) Show that E[XY] = E[X]·E[Y].
(c) Suppose X, Y are independent random variables and a, b ∈ ℝ are constants. Show that
Var[aX + bY] = a²·Var[X] + b²·Var[Y]. (The previous part should be useful.)
(d) Let X be a random variable. Which constant B ∈ ℝ minimizes E[(X − B)²]? What does this say about
E[X] as a measure of the center of the distribution of X?
(e) Lastly, suppose X₁, ..., Xₙ are independent random variables with mean μ and variance σ². Define the
sample average to be X̄ = (1/n)(X₁ + ⋯ + Xₙ). What is E[X̄]? What about Var[X̄]?
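As a concrete illustration of the definitions above and of the identity asked for in part (a), the short Python sketch below computes E[X] and Var[X] exactly for a small discrete distribution. The fair six-sided die used here is an arbitrary illustrative choice, not part of the problem; any finite pmf would work the same way.

```python
# Expectation and variance of a discrete random variable, computed exactly
# from its pmf. The fair six-sided die is an illustrative assumption only.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}                     # P(X = x)

# E[X] = sum over x of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Var[X] two ways: the definition E[(X - E[X])^2], and E[X^2] - E[X]^2 from (a).
var_definition = sum((x - mean) ** 2 * p for x, p in pmf.items())
second_moment = sum(x ** 2 * p for x, p in pmf.items())
var_identity = second_moment - mean ** 2

print(mean)                            # 3.5
print(var_definition, var_identity)    # both 2.9166..., i.e. 35/12
```

Both ways of computing the variance agree (35/12 ≈ 2.917), which is exactly the identity that part (a) asks you to prove in general.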
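Parts (b) and (c) can be sanity-checked numerically before proving them; the Monte Carlo sketch below is only an approximate check, not a proof. The particular distributions (Uniform(0, 1) for X, Exponential(1) for Y) and the constants a = 2, b = −3 are assumptions made purely for illustration.

```python
import random

# Approximate Monte Carlo check of parts (b) and (c); the distributions and
# constants below are illustrative assumptions, not part of the problem.
random.seed(0)
N = 200_000
a, b = 2.0, -3.0

# Independent samples: X ~ Uniform(0, 1), Y ~ Exponential(rate 1).
xs = [random.random() for _ in range(N)]
ys = [random.expovariate(1.0) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

# Part (b): E[XY] should be close to E[X] * E[Y] under independence.
print(mean([x * y for x, y in zip(xs, ys)]), mean(xs) * mean(ys))

# Part (c): Var[aX + bY] should be close to a^2 Var[X] + b^2 Var[Y].
print(var([a * x + b * y for x, y in zip(xs, ys)]),
      a ** 2 * var(xs) + b ** 2 * var(ys))
```

With 200,000 samples the two printed numbers in each pair typically agree to about two decimal places; the remaining gap is Monte Carlo error, which shrinks as N grows.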
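For part (d), a quick way to build intuition is to tabulate E[(X − B)²] over a grid of candidate constants B and see where it is smallest. The fair-die distribution and the grid spacing below are again illustrative assumptions.

```python
# Part (d) intuition: tabulate E[(X - B)^2] exactly for a fair die (an
# illustrative assumption) over a grid of constants B and find the minimizer.
pmf = {x: 1 / 6 for x in range(1, 7)}

def mean_squared_deviation(B):
    """E[(X - B)^2], computed exactly from the pmf."""
    return sum((x - B) ** 2 * p for x, p in pmf.items())

grid = [k / 100 for k in range(100, 601)]        # candidate B from 1.00 to 6.00
best_B = min(grid, key=mean_squared_deviation)
print(best_B)                                    # 3.5, which equals E[X] here
```

The minimizing B lands at 3.5, which is E[X] for the die; part (d) asks why this happens for an arbitrary random variable.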
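For part (e), the claimed behavior of the sample average can be checked empirically. The sketch below assumes n = 25 i.i.d. Uniform(0, 1) variables (so μ = 1/2 and σ² = 1/12) and repeats the averaging experiment many times; these choices are illustrative only.

```python
import random

# Empirical check of part (e) with n i.i.d. Uniform(0, 1) variables
# (an illustrative assumption: mu = 1/2, sigma^2 = 1/12).
random.seed(1)
n, reps = 25, 50_000
mu, sigma2 = 0.5, 1 / 12

# Draw `reps` independent sample averages, each built from n fresh uniforms.
xbars = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]

m = sum(xbars) / reps
v = sum((x - m) ** 2 for x in xbars) / reps

print(m, mu)              # empirical mean of the sample average vs. mu
print(v, sigma2 / n)      # empirical variance of the sample average vs. sigma^2 / n
```

The empirical mean of X̄ sits near μ, and its empirical variance is near σ²/n rather than σ², which is the pattern part (e) asks you to derive.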