3.15 Suppose X₁, ..., Xₙ, Z are independent N(μ, τ⁻¹) random variables and the prior density of (τ, μ) is

$$\pi(\tau,\mu;\alpha,\beta,k,\nu)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\tau^{\alpha-1}e^{-\beta\tau}\cdot(2\pi)^{-1/2}(k\tau)^{1/2}\exp\left\{-\frac{k\tau}{2}(\mu-\nu)^{2}\right\}$$

as in Example 3.3. Recall that in Example 3.3 we showed that the posterior density of (τ, μ), given X₁, ..., Xₙ, is of the same form with parameters (α, β, k, ν) replaced by (α′, β′, k′, ν′).

Suppose now that the objective is prediction about Z, conditionally on X₁, ..., Xₙ. Show that, if π(τ, μ; α′, β′, k′, ν′) is the posterior density of (τ, μ) and Z | (τ, μ) ~ N(μ, τ⁻¹), then the joint posterior density of (τ, Z) is of the form π(τ, Z; α″, β″, k″, ν″), and state explicitly what α″, β″, k″, ν″ are.

Show that the marginal predictive density of Z, given X₁, ..., Xₙ, is

$$\frac{(\beta'')^{\alpha''}}{\Gamma(\alpha'')}\left(\frac{k''}{2\pi}\right)^{1/2}\frac{\Gamma\left(\alpha''+\tfrac{1}{2}\right)}{\left\{\beta''+\tfrac{1}{2}k''(Z-\nu'')^{2}\right\}^{\alpha''+1/2}}$$

and interpret this in terms of the t distribution.
Given independent random variables X₁, ..., Xₙ ~ N(μ, τ⁻¹) and Z ~ N(μ, τ⁻¹), the prior density of (τ, μ) is

$$\pi(\tau,\mu;\alpha,\beta,k,\nu)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\tau^{\alpha-1}e^{-\beta\tau}\cdot(2\pi)^{-1/2}(k\tau)^{1/2}\exp\left\{-\frac{k\tau}{2}(\mu-\nu)^{2}\right\},$$

that is, τ ~ Gamma(α, β) and μ | τ ~ N(ν, (kτ)⁻¹).
By Example 3.3, the posterior density of (τ, μ) given X₁, ..., Xₙ has the same form, π(τ, μ; α′, β′, k′, ν′), with updated parameters

$$\alpha'=\alpha+\frac{n}{2},\qquad \beta'=\beta+\frac{1}{2}\sum_{i=1}^{n}(x_i-\bar{x})^{2}+\frac{kn(\bar{x}-\nu)^{2}}{2(k+n)},\qquad k'=k+n,\qquad \nu'=\frac{k\nu+n\bar{x}}{k+n},$$

where x̄ = n⁻¹ Σᵢ xᵢ.
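As a small illustration (not part of the original exercise), here is a minimal Python sketch of the update (α, β, k, ν) → (α′, β′, k′, ν′); the function name and the simulated data are my own illustrative choices.

```python
# Hypothetical sketch: compute the posterior parameters (alpha', beta', k', nu')
# from data x and the normal-gamma prior (alpha, beta, k, nu), using the
# Example 3.3 update formulas quoted above.
import numpy as np

def posterior_params(x, alpha, beta, k, nu):
    """Return (alpha', beta', k', nu') for the normal-gamma prior."""
    x = np.asarray(x)
    n = len(x)
    xbar = x.mean()
    alpha_p = alpha + n / 2
    beta_p = beta + 0.5 * np.sum((x - xbar) ** 2) + k * n * (xbar - nu) ** 2 / (2 * (k + n))
    k_p = k + n
    nu_p = (k * nu + n * xbar) / (k + n)
    return alpha_p, beta_p, k_p, nu_p

# Example usage with simulated data (arbitrary true values mu = 1, tau = 4).
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=0.5, size=20)
print(posterior_params(x, alpha=2.0, beta=1.0, k=1.0, nu=0.0))
```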
Joint posterior of (τ, Z). Multiply the posterior density by the conditional density of Z and integrate out μ:

$$\pi(\tau,Z\mid x_1,\dots,x_n)=\int p(Z\mid\tau,\mu)\,\pi(\tau,\mu;\alpha',\beta',k',\nu')\,d\mu,\qquad p(Z\mid\tau,\mu)=(2\pi)^{-1/2}\tau^{1/2}\exp\left\{-\frac{\tau}{2}(Z-\mu)^{2}\right\}.$$

Since Z | (τ, μ) ~ N(μ, τ⁻¹) and, under the posterior, μ | τ ~ N(ν′, (k′τ)⁻¹), integrating out μ gives Z | τ ~ N(ν′, τ⁻¹(1 + 1/k′)); the Monte Carlo sketch below illustrates this.
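A hedged Monte Carlo sanity check of that μ-integration step, assuming arbitrary made-up values for (α′, β′, k′, ν′) and a fixed τ; the variable names and numbers are illustrative only.

```python
# Hypothetical check: for a fixed tau, draw mu from the posterior conditional
# and then Z given (mu, tau), and confirm that Z | tau has mean nu' and
# variance (1 + 1/k') / tau, i.e. precision (k'/(k'+1)) * tau.
import numpy as np

rng = np.random.default_rng(1)
k_p, nu_p = 5.0, 1.3          # k', nu' (arbitrary illustrative values)
tau = 2.0                     # condition on a fixed tau

mu = rng.normal(nu_p, np.sqrt(1 / (k_p * tau)), size=1_000_000)  # mu | tau, data
z = rng.normal(mu, np.sqrt(1 / tau))                             # Z | mu, tau

print(z.mean(), nu_p)                  # sample mean ~ nu'
print(z.var(), (1 + 1 / k_p) / tau)    # sample variance ~ (1 + 1/k') / tau
```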
Explicitly, completing the square in μ,

$$(Z-\mu)^{2}+k'(\mu-\nu')^{2}=(k'+1)\left(\mu-\frac{Z+k'\nu'}{k'+1}\right)^{2}+\frac{k'}{k'+1}(Z-\nu')^{2},$$

and carrying out the Gaussian integral over μ shows that the joint posterior density of (τ, Z) is π(τ, Z; α″, β″, k″, ν″), of the same normal-gamma form, with

$$\alpha''=\alpha',\qquad \beta''=\beta',\qquad k''=\frac{k'}{k'+1},\qquad \nu''=\nu'.$$

Finally, integrating over τ using ∫₀^∞ τ^{α″−1/2} e^{−cτ} dτ = Γ(α″ + ½)/c^{α″+1/2}, with c = β″ + ½k″(Z − ν″)², gives the marginal predictive density of Z given X₁, ..., Xₙ:

$$\frac{(\beta'')^{\alpha''}}{\Gamma(\alpha'')}\left(\frac{k''}{2\pi}\right)^{1/2}\frac{\Gamma\left(\alpha''+\tfrac{1}{2}\right)}{\left\{\beta''+\tfrac{1}{2}k''(Z-\nu'')^{2}\right\}^{\alpha''+1/2}}.$$

This is the density of ν″ + (β″/(α″k″))^{1/2} T with T ~ t_{2α″}; equivalently, (Z − ν″)(α″k″/β″)^{1/2} has a t distribution with 2α″ degrees of freedom, conditionally on the data.
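To make the t interpretation concrete, here is a small numerical check (my own sketch, using scipy) that the closed-form predictive density above coincides with the t_{2α″} density with location ν″ and scale (β″/(α″k″))^{1/2}; the parameter values are arbitrary.

```python
# Hypothetical verification: the closed-form predictive density equals a
# shifted/scaled Student-t density with 2*alpha'' degrees of freedom.
import numpy as np
from scipy.special import gammaln
from scipy.stats import t as student_t

a2, b2, k2, v2 = 3.0, 2.5, 0.8, 1.2   # alpha'', beta'', k'', nu'' (made up)

def predictive_pdf(z):
    """Closed-form marginal predictive density of Z given the data."""
    log_num = a2 * np.log(b2) + 0.5 * np.log(k2 / (2 * np.pi)) + gammaln(a2 + 0.5)
    log_den = gammaln(a2) + (a2 + 0.5) * np.log(b2 + 0.5 * k2 * (z - v2) ** 2)
    return np.exp(log_num - log_den)

def t_pdf(z):
    """Same density written as a t_{2*alpha''} distribution."""
    scale = np.sqrt(b2 / (a2 * k2))   # scale (beta'' / (alpha'' k''))^{1/2}
    return student_t.pdf(z, df=2 * a2, loc=v2, scale=scale)

z = np.linspace(-5, 8, 200)
assert np.allclose(predictive_pdf(z), t_pdf(z))   # the two expressions agree
```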