Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 80% of their games.
To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage.
It turns out that with a five-point advantage Bob wins 20% of the time; he wins 50% of the time with a ten-point advantage and 80% of the time with a fifteen-point advantage.
Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system and the steady-state vector.
I cannot understand how to form the transition matrix in this type of question. Please help.
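One way to start: each row of the transition matrix corresponds to a current state (consecutive games won by Doug), and its entries are the probabilities of moving to each state after the next game, read directly from the win percentages given above. Below is a minimal sketch of that idea; the use of NumPy and the state labels 0 through 3 are assumptions for illustration, not part of the original question, and the steady-state vector could just as well be found by hand from pi P = pi.

```python
import numpy as np

# States = consecutive games won by Doug: 0, 1, 2, 3 (state 3 means "three or more").
# Entry P[i, j] is the probability of moving from state i to state j after one game.
# A Bob win always returns to state 0; a Doug win advances the streak (capped at state 3).
P = np.array([
    [0.2, 0.8, 0.0, 0.0],  # state 0: no advantage, Bob wins 20% of the time
    [0.2, 0.0, 0.8, 0.0],  # state 1: five-point spot, Bob wins 20%
    [0.5, 0.0, 0.0, 0.5],  # state 2: ten-point spot, Bob wins 50%
    [0.8, 0.0, 0.0, 0.2],  # state 3: fifteen-point spot, Bob wins 80%
])

# Steady-state vector: left eigenvector of P for eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
steady_state = np.real(eigvecs[:, idx])
steady_state /= steady_state.sum()
print(steady_state)  # long-run fraction of time spent in each state
```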