Question 4
Consider flipping a (biased) coin for which the probability of heads is p. The fraction of heads
after n independent tosses is Xn. The law of large numbers implies that Xn → p as n → ∞. This
does not mean that Xn will exactly equal p, but rather that the distribution of Xn is tightly
concentrated around p for large n.
(a) Suppose 0.1 < p < 0.9. Use Chebyshev's inequality to obtain a lower bound on
P(p − 0.1 < Xn < p + 0.1).
(b) Suppose p = 0.6. Using the lower bound derived above from Chebyshev's inequality,
how large should n be so that P(0.5 < Xn < 0.7) > 0.95?
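For orientation, here is a minimal sketch of the kind of Chebyshev calculation parts (a) and (b) ask for, assuming Xn is the average of n i.i.d. Bernoulli(p) tosses (so E[Xn] = p and Var(Xn) = p(1−p)/n); the constants below follow from that assumption rather than from the original text:

```latex
% Sketch only: assumes X_n is the mean of n i.i.d. Bernoulli(p) tosses,
% so E[X_n] = p and Var(X_n) = p(1-p)/n <= 1/(4n) whenever 0.1 < p < 0.9.
\[
  P\bigl(|X_n - p| \ge 0.1\bigr)
    \le \frac{\mathrm{Var}(X_n)}{0.1^2}
    =   \frac{p(1-p)}{0.01\,n}
    \le \frac{25}{n},
  \quad\mbox{hence}\quad
  P\bigl(p - 0.1 < X_n < p + 0.1\bigr) \ge 1 - \frac{25}{n}.
\]
% For (b), requiring 1 - 25/n > 0.95 gives n > 500 under the p(1-p) <= 1/4 bound;
% plugging in p = 0.6 directly (p(1-p) = 0.24) would instead give n > 480.
```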
Note: The purpose of this question is to enhance your understanding of Chebyshev's
inequality. In practice, the bound provided by Chebyshev's inequality is usually very "loose",
in the sense that a much smaller n already suffices for P(0.5 < Xn < 0.7) > 0.95 to hold.
A tighter bound for the setting considered here can be obtained from Hoeffding's inequality
(Example 6.15 in Wasserman) or the Central Limit Theorem (we will study this in class). Note
that the bounds from Chebyshev's inequality and Hoeffding's inequality are finite-sample, in
that they hold for any finite n. In contrast, the bound from the Central Limit Theorem
is asymptotic, in that it is a statement about n → ∞, and it only holds approximately for
finite n.
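To see concretely how loose the Chebyshev requirement is, a small Monte Carlo check along the following lines can be run for p = 0.6; the sample sizes, repetition count, and seed below are arbitrary illustrative choices, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.6           # true probability of heads, as in part (b)
reps = 20_000     # Monte Carlo repetitions per sample size

for n in (50, 100, 200, 500):
    # X_n = fraction of heads in n independent tosses, simulated `reps` times
    x_n = rng.binomial(n, p, size=reps) / n
    coverage = np.mean((x_n > 0.5) & (x_n < 0.7))
    chebyshev_lower = max(0.0, 1 - 25 / n)   # bound using p(1-p) <= 1/4
    print(f"n = {n:4d}: empirical P(0.5 < X_n < 0.7) ~ {coverage:.3f}, "
          f"Chebyshev lower bound = {chebyshev_lower:.3f}")
```

In runs of this kind the empirical probability typically clears 0.95 well before n = 500, whereas the Chebyshev lower bound 1 − 25/n only reaches 0.95 at n = 500, consistent with the note above that the bound is loose.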