
Question

Please solve within 1-2 hours.

Problem 3. In this problem, we are going to do both maximum likelihood estimation and Bayesian estimation.

(a) We have one unknown parameter θ. We draw X₁, X₂, ..., X₈ independently from a Bernoulli(θ) distribution. Suppose X₁ = X₂ = X₅ = 1 and X₃ = X₄ = X₆ = X₇ = X₈ = 0, i.e. we get 3 successes and 5 failures. What is the likelihood function L(θ) and what is the log-likelihood function ln L(θ)?

(b) What is the maximum likelihood estimator of θ given this data?

(c) Suppose we have a prior under which θ = 0.75 with probability 0.6 and θ = 0.25 with probability 0.4. What is the posterior distribution in this case? What is the maximum a posteriori estimator?

(d) Instead of the prior in part (c), suppose we have a prior under which θ is uniformly distributed on [0, 1]. What is the posterior distribution in this case? What is the maximum a posteriori estimator?
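The worked solution is not reproduced here, but the standard calculations can be sanity-checked numerically. Below is a minimal Python sketch (not part of the original answer; the grid search and variable names are illustrative assumptions) that evaluates the likelihood L(θ) = θ³(1−θ)⁵, locates its maximizer, and forms the posterior for the two-point prior in part (c).

```python
# Hypothetical self-check for the Bernoulli MLE / Bayesian estimation problem
# with 3 observed successes and 5 observed failures.
import numpy as np

successes, failures = 3, 5

def likelihood(theta):
    # L(theta) = theta^3 * (1 - theta)^5 for the observed sample
    return theta**successes * (1 - theta)**failures

# (b) Maximize ln L(theta) = 3 ln(theta) + 5 ln(1 - theta);
# setting the derivative to zero gives theta_hat = 3/8 = 0.375.
theta_grid = np.linspace(0.001, 0.999, 9999)
theta_mle = theta_grid[np.argmax(likelihood(theta_grid))]
print("MLE ≈", theta_mle)  # ≈ 0.375

# (c) Two-point prior: P(theta = 0.75) = 0.6, P(theta = 0.25) = 0.4.
# The posterior is proportional to prior * likelihood.
prior = {0.75: 0.6, 0.25: 0.4}
unnorm = {t: p * likelihood(t) for t, p in prior.items()}
Z = sum(unnorm.values())
posterior = {t: w / Z for t, w in unnorm.items()}
print("posterior:", posterior)                    # most mass on theta = 0.25
print("MAP:", max(posterior, key=posterior.get))  # 0.25

# (d) With a uniform prior on [0, 1], the posterior density is proportional
# to theta^3 (1 - theta)^5, i.e. a Beta(4, 6) distribution; its mode is
# (4 - 1) / (4 + 6 - 2) = 3/8, which matches the MLE.
```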