3. Let $X = \{X_k, k \in \mathbb{N}\}$ be a Markov chain with state space $S = \{1, 2, \dots, m\}$. The realizations of the process are observed through time $k = 0, 1, \dots, n$, with $n < \infty$. From the realizations of the sample path, the following observations are recorded:

$N_{ij} = \sum_{k=0}^{n-1} \mathbf{1}\{X_k = i,\, X_{k+1} = j\} = \#(i \to j)$,

$N_i = \sum_{j=1}^{m} N_{ij} = \sum_{k=0}^{n-1} \mathbf{1}\{X_k = i\} = \#\,\text{transitions out of state } i$.

Using these observations, we want to estimate the transition matrix $P$ of the Markov chain. For this purpose, let $i_k \in S$ be the state that $X$ occupies in step $k$, i.e., $X_k = i_k$, $k \in \{0, \dots, n\}$. The likelihood function of the realization $\{X_0, \dots, X_n\}$ of $X$ after $n$ steps reads

$L = P(X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_0 = i_0)$.    (2)

(a) Let $P(X_0 = i_0) = 1$, i.e., $X$ starts in state $i_0$ with probability one. Let $p_{ij} = P(X_1 = j \mid X_0 = i)$ be the $(i, j)$ element of the transition matrix $P$. Show that (2) simplifies to

$L = P(X_1 = i_n \mid X_0 = i_{n-1}) \times P(X_1 = i_{n-1} \mid X_0 = i_{n-2}) \times \dots \times P(X_1 = i_1 \mid X_0 = i_0) = \prod_{k=0}^{n-1} p_{i_k i_{k+1}}$.

(b) Given that $i_k, i_{k+1} \in S$, the likelihood reduces to

$L = \prod_{i=1}^{m} \prod_{j=1}^{m} p_{ij}^{N_{ij}}$.

Derive the log-likelihood function of the realizations.

(c) Show that the maximum likelihood estimate $\hat{p}_{ij}$ of $p_{ij}$ is given by

$\hat{p}_{ij} = \dfrac{N_{ij}}{N_i}$.

(d) Show that $\hat{p}_{ij}$ is a consistent estimator of $p_{ij}$, i.e., $\hat{p}_{ij} \to p_{ij}$ as $n \to \infty$.

(e) A simulation study of $X$ on state space $S = \{1, 2, 3\}$ yields the following realization of the sample path:

3 1 2 1 3 2 1 2 1 2 1 2 1 2 1 3 1 3 1 2 3 2 3 2 3 2 1 3 1 2 1 2 3 1 3 1 2 1 2

Use this realization to estimate the transition matrix $P$.
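For parts (b)–(d), one standard route (sketched here for orientation only; it is not the required handwritten working) is to take logarithms and maximize each row of $P$ under the constraint that the row sums to one:

$\ell(P) = \log L = \sum_{i=1}^{m} \sum_{j=1}^{m} N_{ij} \log p_{ij}$, subject to $\sum_{j=1}^{m} p_{ij} = 1$ for every $i$.

Introducing a Lagrange multiplier $\lambda_i$ per row and setting $\dfrac{\partial}{\partial p_{ij}}\Big[\sum_j N_{ij} \log p_{ij} - \lambda_i \big(\sum_j p_{ij} - 1\big)\Big] = \dfrac{N_{ij}}{p_{ij}} - \lambda_i = 0$ gives $p_{ij} = N_{ij}/\lambda_i$; the row-sum constraint then forces $\lambda_i = \sum_j N_{ij} = N_i$, hence $\hat{p}_{ij} = N_{ij}/N_i$. For part (d), one common argument uses the ergodic theorem for positive recurrent chains: $N_{ij}/n \to \pi_i\, p_{ij}$ and $N_i/n \to \pi_i$ almost surely, so their ratio $\hat{p}_{ij}$ converges to $p_{ij}$.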
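For part (e), a minimal numerical cross-check is sketched below. It assumes the sample path as transcribed above, reading the trailing unspaced digits as single-digit states (unambiguous since $S = \{1, 2, 3\}$); it is not a substitute for the handwritten working.

import numpy as np

# Sample path transcribed from part (e).
path = [3, 1, 2, 1, 3, 2, 1, 2, 1, 2, 1, 2, 1, 2, 1, 3, 1, 3, 1, 2,
        3, 2, 3, 2, 3, 2, 1, 3, 1, 2, 1, 2, 3, 1, 3, 1, 2, 1, 2]

m = 3
counts = np.zeros((m, m))                 # counts[i-1, j-1] = N_ij
for a, b in zip(path[:-1], path[1:]):     # each consecutive pair is one transition
    counts[a - 1, b - 1] += 1

row_totals = counts.sum(axis=1, keepdims=True)   # N_i = transitions out of state i
P_hat = counts / row_totals                      # MLE from part (c): N_ij / N_i

print("N_ij =\n", counts.astype(int))
print("P_hat =\n", np.round(P_hat, 3))

The row-wise normalization is exactly the estimator $\hat{p}_{ij} = N_{ij}/N_i$ from part (c); the raw counts $N_{ij}$ are what the handwritten working would tabulate first.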
Question
Please do the questions with handwritten working. I'm struggling to understand what to write