Question
This problem explores the question of testing
H₀: p = 0.5 versus Hₐ: p < 0.5,
based on a random sample X₁, X₂, ..., Xₙ from the Geometric(p) distribution.
Recall that there is an ambiguity surrounding the definition of the Geometric(p) distribution, so let us unequivocally specify which distribution we are going to work with. In the experiment of repeatedly tossing a coin, with probability p ∈ (0, 1) of getting a Head on a single toss, until a Head is obtained, let X denote the number of Tails obtained before the experiment is terminated. Then X has the Geometric(p) distribution.
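Under this convention, the probability mass function of X is the standard one (recorded here only for reference):

P(X = k) = (1 − p)^k · p,   k = 0, 1, 2, ....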
Further recall that "X₁, X₂, ..., Xₙ is a random sample of size n from the Geometric(p) distribution" means that X = (X₁, X₂, ..., Xₙ) is a collection of mutually independent random variables, with each Xᵢ having the Geometric(p) distribution. Before proceeding further, let us recall the following definition.
Definition. A collection of random variables {Y₁, Y₂, ..., Yᵣ} defined on the same sample space, with Sᵢ denoting the support of Yᵢ for 1 ≤ i ≤ r, is said to be mutually independent if, for every (s₁, s₂, ..., sᵣ) ∈ S₁ × S₂ × ⋯ × Sᵣ, the Cartesian product of (S₁, S₂, ..., Sᵣ),

P(⋂ᵢ₌₁ʳ {Yᵢ = sᵢ}) = ∏ᵢ₌₁ʳ P(Yᵢ = sᵢ).
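For instance, for r = 2 this is just the familiar condition that, for all s₁ ∈ S₁ and s₂ ∈ S₂,

P(Y₁ = s₁, Y₂ = s₂) = P(Y₁ = s₁) · P(Y₂ = s₂).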
Remember that if {Y₁, Y₂, ..., Yᵣ} is a collection of mutually independent random variables, then for any 1 ≤ m < r, the random variables

U = g(Y₁, Y₂, ..., Yₘ) and V = h(Yₘ₊₁, Yₘ₊₂, ..., Yᵣ)

are independent.
(a) Find the probability mass function of (X₁ + X₂).
(b) Using the result of (a), guess what the probability mass function of ∑ᵢ₌₁ⁿ Xᵢ will be, and substantiate your guess by using the method of induction.
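(Aside, not part of what is asked: a candidate answer to (a) or (b) can be sanity-checked with a short Monte Carlo simulation. The sketch below uses NumPy, with the parameter value, sample size, and replication count chosen purely for illustration; NumPy's geometric sampler returns the number of tosses up to and including the first Head, so 1 is subtracted to match the "number of Tails" convention above.)

import numpy as np

# Minimal sketch: empirically estimate the pmf of X1 + ... + Xn, where each Xi
# is the number of Tails before the first Head. numpy's geometric sampler
# counts tosses up to and including the first Head, so subtract 1 to match
# the convention used in this problem.
rng = np.random.default_rng(0)
p, n, reps = 0.5, 2, 200_000       # illustrative values only

samples = rng.geometric(p, size=(reps, n)) - 1
sums = samples.sum(axis=1)

for k in range(6):
    print(f"P(sum = {k}) is approximately {np.mean(sums == k):.4f}")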
In (c), (d), (e), and (f), consider the problem of testing

H₀: p = 0.5 versus Hₐ: p = 0.2

at α = 0.038406.
(c) Write down the likelihood function L(p; x), where x = (x₁, x₂, ..., xₙ) is the data sample realized from the random sample X = (X₁, X₂, ..., Xₙ).
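(For reference only: since the Xᵢ are mutually independent, the likelihood is the product of the marginal pmfs evaluated at the observed values, L(p; x) = ∏ᵢ₌₁ⁿ P(Xᵢ = xᵢ); specializing this product using the pmf recorded above is the content of (c).)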
(d) Write down the expression for the Neyman-Pearson test statistic T(x) and the corresponding rejection region, as outlined in the posted notes on the Neyman-Pearson Lemma.
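(For reference, and without presupposing the exact convention of the posted notes: the Neyman-Pearson construction for two simple hypotheses is based on the likelihood ratio, e.g. Λ(x) = L(0.2; x) / L(0.5; x), with a rejection region of the form {x : Λ(x) ≥ k}, where the cut-off k is chosen so that the test has size α = 0.038406. Some treatments use the reciprocal ratio and reject for small values instead.)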