Question
A probability transition matrix $P$ on a state space $S$ is called doubly stochastic if its column sums are all 1, i.e., $\sum_{i \in S} P(i, j) = 1$ for every $j \in S$.

(a) If $S$ is finite and $P$ is doubly stochastic, show that all states of the Markov chain are positive recurrent.

(b) If, in addition to the assumptions in part (a), $P$ is also irreducible and aperiodic, then deduce that
$$P^n(i, j) \to \frac{1}{|S|} \quad \text{as } n \to \infty.$$
What does this tell you about equilibrium distributions of the Markov chain?

(c) If $S$ is countably infinite and $P$ is an irreducible, doubly stochastic transition matrix, show that either all states are null-recurrent or all states are transient. What does this tell you about equilibrium distributions of the Markov chain?
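The observation driving parts (a) and (b) is that double stochasticity makes the uniform distribution invariant. The computation below is an illustrative sketch, using only the column-sum condition from the definition above; it is not the blurred expert solution.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch: for a finite state space $S$ and doubly stochastic $P$, the uniform
% distribution $\pi(i) = 1/|S|$ is stationary, because every column of $P$ sums to 1:
\begin{align*}
\sum_{i \in S} \pi(i)\, P(i, j)
  = \frac{1}{|S|} \sum_{i \in S} P(i, j)
  = \frac{1}{|S|}
  = \pi(j) \qquad \text{for every } j \in S .
\end{align*}
% Moreover, $P^n$ is again doubly stochastic, so $\sum_{i \in S} P^n(i, j) = 1$
% for every $n$; when $S$ is finite this rules out $P^n(i, j) \to 0$ for all $i$,
% which is what transience of state $j$ would force.
\end{document}
```

Having the stationary distribution $\pi(j) = 1/|S|$, combined with the irreducibility and aperiodicity assumed in part (b), is exactly what the convergence theorem for finite Markov chains turns into the limit $P^n(i, j) \to 1/|S|$.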
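The limit in part (b) can also be checked numerically on a toy example. The sketch below assumes NumPy is available; the particular 3×3 matrix is an arbitrary choice (symmetric, hence doubly stochastic, with all entries positive so the chain is irreducible and aperiodic) and is not taken from the question.

```python
import numpy as np

# An illustrative 3x3 doubly stochastic transition matrix (symmetric, so rows
# and columns both sum to 1); all entries are positive, making the chain
# irreducible and aperiodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

assert np.allclose(P.sum(axis=1), 1.0)  # row sums = 1: P is a transition matrix
assert np.allclose(P.sum(axis=0), 1.0)  # column sums = 1: P is doubly stochastic

n = 50
Pn = np.linalg.matrix_power(P, n)        # n-step transition probabilities P^n(i, j)
uniform = np.full_like(P, 1.0 / len(P))  # constant matrix with entries 1/|S|

print(Pn)
print("max |P^n(i,j) - 1/|S||:", np.abs(Pn - uniform).max())
```

For this matrix the printed deviation should already be essentially zero at n = 50, consistent with the claimed limit P^n(i, j) → 1/|S|.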