Question
all parts please

Let the observed data be y = (6, 4, 9, 2, 0, 3), a random sample from the Poisson distribution with mean λ, where λ > 0 is unknown. Suppose that we assume a Gamma(1, 1) prior distribution for λ. The posterior density, p(λ | y), for λ is Gamma(1 + S, 1 + n), where S = Σᵢ yᵢ is the sum of the observations and n = 6. Suppose that you want to construct a symmetric Metropolis-Hastings algorithm on the log scale to generate a sample from this posterior distribution by using a normal proposal distribution with standard deviation b = 0.2.
(a) Write down the steps of this symmetric Metropolis-Hastings algorithm (on the log scale) to simulate realisations from the posterior density p(λ | y).
(b) Implement the algorithm in R and plot the observations as a function of the iterations. Use M = 5000 for the number of iterations.
(c) To assess the accuracy, compare the empirical distribution of the sample with the exact posterior density, Gamma(1 + S, 1 + n).
(d) Rerun the algorithm in R using a smaller b = 0.01 and a larger b = 20. What are the effects on the behaviour of the algorithm of making b smaller? What are the effects of making it larger?
(e) Add code to count how many times the proposed value for λ was accepted. Rerun the algorithm using values of b = 0.01, b = 0.2 and b = 20, and each time calculate the proportion of steps that were accepted. Then plot this acceptance probability against b. Examine how the acceptance probability for this algorithm depends on b.
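
For parts (a) and (b), a minimal R sketch of one way to run the symmetric random-walk Metropolis-Hastings sampler on the log scale is given below. It works with θ = log(λ), so the log target includes a Jacobian term; the function name log_post and the starting value log(mean(y)) are illustrative choices rather than anything prescribed by the question.

```r
# Data and the exact posterior: Gamma(1 + S, 1 + n)
y <- c(6, 4, 9, 2, 0, 3)
S <- sum(y)                       # S = 24
n <- length(y)                    # n = 6
shape <- 1 + S
rate  <- 1 + n

# Log posterior of theta = log(lambda), up to a constant:
# (shape - 1) * theta - rate * exp(theta) is the Gamma log density in lambda,
# and the additional "+ theta" is the Jacobian of the log transformation.
log_post <- function(theta) shape * theta - rate * exp(theta)

set.seed(1)
M <- 5000                         # number of iterations
b <- 0.2                          # proposal standard deviation on the log scale

theta <- numeric(M)
theta[1] <- log(mean(y))          # starting value (a sensible but arbitrary choice)
for (t in 2:M) {
  prop <- theta[t - 1] + rnorm(1, mean = 0, sd = b)    # symmetric random-walk proposal
  log_alpha <- log_post(prop) - log_post(theta[t - 1]) # log acceptance ratio
  if (log(runif(1)) < log_alpha) {
    theta[t] <- prop              # accept the proposal
  } else {
    theta[t] <- theta[t - 1]      # reject: keep the current value
  }
}

lambda <- exp(theta)              # back-transform to the lambda scale

# Trace plot of the sampled values against iteration, as asked for in part (b)
plot(lambda, type = "l", xlab = "Iteration", ylab = expression(lambda))
```

With b = 0.2 the chain should move freely around the posterior; how this behaviour changes for smaller and larger b is the subject of parts (d) and (e).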
Expert Solution

Step 1: Given information
Step 2: (a) Writing the steps of the symmetric Metropolis-Hastings algorithm on the log scale
Step 3: (b) Implementing the algorithm in R
Step 4: (c) Comparing the empirical distribution of the sample with the exact posterior density
Step 5: (d) Running the Metropolis-Hastings algorithm with different proposal standard deviations
Step 6: (e) Analysing the acceptance probability of the Metropolis-Hastings algorithm across different proposals
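
One way to carry out the comparison in step 4 (part (c)) is to overlay the exact Gamma(1 + S, 1 + n) density on a density-scaled histogram of the draws. The sketch below reuses lambda, S and n from the sampler above; the number of histogram breaks is an arbitrary choice, and discarding an initial burn-in before plotting is a common refinement.

```r
# Part (c): compare the draws with the exact Gamma(1 + S, 1 + n) posterior
hist(lambda, breaks = 40, freq = FALSE,
     xlab = expression(lambda), main = "MCMC sample vs. exact posterior")
curve(dgamma(x, shape = 1 + S, rate = 1 + n), add = TRUE, col = "red", lwd = 2)
```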
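For steps 5 and 6 (parts (d) and (e)), one possible sketch wraps the sampler in a function that also records the proportion of accepted proposals. run_mh is a hypothetical helper name, and the code assumes y and log_post from the earlier sketch are already defined.

```r
# Parts (d) and (e): rerun the sampler for several values of b and record the
# proportion of accepted proposals.
run_mh <- function(b, M = 5000, theta0 = log(mean(y))) {
  theta <- numeric(M)
  theta[1] <- theta0
  acc <- 0
  for (t in 2:M) {
    prop <- theta[t - 1] + rnorm(1, mean = 0, sd = b)
    if (log(runif(1)) < log_post(prop) - log_post(theta[t - 1])) {
      theta[t] <- prop
      acc <- acc + 1
    } else {
      theta[t] <- theta[t - 1]
    }
  }
  list(lambda = exp(theta), acc_rate = acc / (M - 1))
}

bs <- c(0.01, 0.2, 20)
acc_rates <- sapply(bs, function(b) run_mh(b)$acc_rate)

# Acceptance proportion against b (log-scaled x-axis, since b spans three
# orders of magnitude)
plot(bs, acc_rates, type = "b", log = "x",
     xlab = "b", ylab = "Acceptance proportion")
```

In general, a very small b accepts almost every proposal but the chain moves slowly, while a very large b rejects almost every proposal, so the acceptance proportion decreases as b grows; the intermediate value b = 0.2 sits between these extremes.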
