Question
Bayesian Inference

NOTE. For positive integers $a$ and $b$, you may assume that
$$\int_0^1 z^{a-1}(1-z)^{b-1}\,dz = \frac{(a-1)!\,(b-1)!}{(a+b-1)!}.$$

1. Consider a 6-sided dice in which a dice roll has probability $\theta$ of showing a 6 and probability $(1-\theta)$ of showing some other number (1, 2, 3, 4, 5), where $0 < \theta < 1$. For a series of dice rolls, let $x_1, \ldots, x_n$ be the numbers of dice rolls between successive 6s, so the dice roll sequence 1, 6, 1, 3, 5, 2, 6, ... would have $x_1 = 2$, $x_2 = 5$ and so on, with sample mean $\bar{x} = \frac{1}{n}(x_1 + \cdots + x_n)$.

(a) Show that the likelihood function $L(\theta; x)$ for the parameter $\theta$ given the data $x = (x_1, \ldots, x_n)$ is $L(\theta; x) = \theta^n (1-\theta)^{n(\bar{x}-1)}$, and show that the Maximum Likelihood (ML) estimate $\hat{\theta}$ of $\theta$ is $\hat{\theta} = 1/\bar{x}$.

(b) For a uniform prior distribution, show that the posterior density function $\pi(\theta \mid x)$ of the posterior distribution $(\theta \mid x)$ is given by
$$\pi(\theta \mid x) = \frac{(n\bar{x}+1)!}{n!\,\bigl(n(\bar{x}-1)\bigr)!}\,\theta^n (1-\theta)^{n(\bar{x}-1)}.$$

(c) Show that the mean of the posterior distribution is $E\bigl((\theta \mid x)\bigr) = \dfrac{n+1}{n\bar{x}+2}$.

(d) Find the maximum a posteriori (MAP) estimate of $\theta$, that is to say the value of $\theta$ maximising $\pi(\theta \mid x)$.

(e) Find the ML estimate $\hat{\theta}$ of $\theta$, the MAP estimate of $\theta$ and the posterior mean $E\bigl((\theta \mid x)\bigr)$ of the posterior distribution $(\theta \mid x)$ for the data $x = (2, 1, 5, 9, 12, 3, 1, 1, 8, 4)$ with $n = 10$ data points.
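For a quick numerical sanity check on part (e), here is a minimal Python sketch. It assumes the geometric-count model and the Beta$(n+1,\ n(\bar{x}-1)+1)$ posterior that follow from parts (a)–(c) under the uniform prior; the variable names are illustrative and not part of the original question.

```python
from math import isclose

# Data from part (e): numbers of dice rolls between successive 6s.
x = [2, 1, 5, 9, 12, 3, 1, 1, 8, 4]
n = len(x)                 # n = 10 data points
xbar = sum(x) / n          # sample mean, here 4.6

# ML estimate: theta_hat = 1/xbar maximises theta^n (1-theta)^{n(xbar-1)}.
theta_ml = 1 / xbar

# With a uniform prior the posterior is Beta(n + 1, n*(xbar - 1) + 1), so the
# MAP estimate (the posterior mode) is n/(n*xbar) = 1/xbar, and the posterior
# mean is (n + 1)/(n*xbar + 2).
theta_map = n / (n * xbar)
theta_post_mean = (n + 1) / (n * xbar + 2)

print(f"ML estimate:    {theta_ml:.4f}")        # ~0.2174
print(f"MAP estimate:   {theta_map:.4f}")       # ~0.2174
print(f"Posterior mean: {theta_post_mean:.4f}") # 11/48 ~ 0.2292

# Under a uniform prior the MAP and ML estimates coincide.
assert isclose(theta_ml, theta_map)
```

Because the prior is uniform, the MAP estimate coincides with the ML estimate $1/\bar{x} \approx 0.217$, while the posterior mean $(n+1)/(n\bar{x}+2) = 11/48 \approx 0.229$ sits slightly closer to the prior mean of $1/2$.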