
Question

**Problem 2:**

Fix \( \lambda > 0 \), let \( X \) be a Poisson random variable (RV), i.e., \( X \sim \text{Po}(\lambda) \), and let \( Y \overset{\Delta}{=} X \mid \{X > 0\} \) denote the RV \( X \) conditioned on the event \(\{X > 0\}\). Clearly, the support of \( Y \) is \( \mathbb{N} = \{1, 2, 3, \ldots\}\). Please do the following:

(a) **Derive the probability mass function** of the RV \( Y \), i.e., give \( p_Y(k) \) in terms of \( k, \lambda \).

(b) **Prove the PMF is valid**, i.e., show that it sums to one. Hint: recall the series representation
   \[
   e^x = \sum_{k=0}^{\infty} \frac{x^k}{k!}.
   \]

(c) **Derive an expression for the expected value of \( Y \)**, i.e., \(\mathbb{E}[Y]\), in terms of \( \lambda \). Hint: use the total expectation theorem for \(\mathbb{E}[X]\), conditioning on the partition \(\{X > 0\}\) and \(\{X = 0\}\), recognizing \(\mathbb{E}[Y] = \mathbb{E}[X \mid X > 0]\).

(d) **Derive an expression for the expected squared value of \( Y \)**, i.e., \(\mathbb{E}[Y^2]\), in terms of \( \lambda \). Hint: use the total expectation theorem for \(\mathbb{E}[X^2]\), conditioning on the partition \(\{X > 0\}\) and \(\{X = 0\}\), recognizing \(\mathbb{E}[Y^2] = \mathbb{E}[X^2 \mid X > 0]\).

(e) **Derive an expression for the variance of \( Y \)**, i.e., \(\text{var}(Y)\), in terms of \( \lambda \). Hint: combine parts (c) and (d) using \(\text{var}(Y) = \mathbb{E}[Y^2] - (\mathbb{E}[Y])^2\).
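
The snippet below is not part of the original problem; it is a minimal numerical sanity check you can run against your own answers to (a)–(e). It assumes the standard zero-truncated Poisson results that the hints lead to, namely \( p_Y(k) = \frac{e^{-\lambda}\lambda^{k}}{k!\,(1 - e^{-\lambda})} \) for \( k \ge 1 \), \( \mathbb{E}[Y] = \frac{\lambda}{1 - e^{-\lambda}} \), \( \mathbb{E}[Y^{2}] = \frac{\lambda + \lambda^{2}}{1 - e^{-\lambda}} \), and \( \text{var}(Y) = \mathbb{E}[Y^{2}] - (\mathbb{E}[Y])^{2} \); the choices \( \lambda = 2.5 \) and the truncation point `K = 60` are arbitrary illustration values.

```python
import math

# Numerical sanity check for Y = X | {X > 0} with X ~ Po(lam).
# The closed forms being compared against are the standard
# zero-truncated Poisson results stated in the lead-in above;
# lam and K are arbitrary illustration values, not part of the problem.

lam = 2.5                      # any fixed lambda > 0
p_pos = 1.0 - math.exp(-lam)   # P(X > 0) = 1 - e^{-lambda}

def p_Y(k: int) -> float:
    """PMF of the zero-truncated Poisson, defined for k >= 1."""
    return math.exp(-lam) * lam**k / (math.factorial(k) * p_pos)

K = 60  # upper limit for the sums; the tail beyond K is negligible for this lam
total  = sum(p_Y(k)        for k in range(1, K + 1))
mean   = sum(k * p_Y(k)    for k in range(1, K + 1))
second = sum(k**2 * p_Y(k) for k in range(1, K + 1))
var    = second - mean**2

print(f"(b) sum of PMF : {total:.10f}   (should be 1)")
print(f"(c) E[Y]       : {mean:.10f}   vs  {lam / p_pos:.10f}")
print(f"(d) E[Y^2]     : {second:.10f}   vs  {(lam + lam**2) / p_pos:.10f}")
print(f"(e) var(Y)     : {var:.10f}   vs  {(lam + lam**2) / p_pos - (lam / p_pos)**2:.10f}")
```

Each numerically summed quantity should agree with the corresponding closed form to within floating-point precision; if an expression you derived differs, the mismatch will show up immediately in the printed comparison.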