Thomas' Calculus and Linear Algebra and Its Applications Package for the Georgia Institute of Technology, 1/e
5th Edition
ISBN: 9781323132098
Author: Thomas, Lay
Publisher: PEARSON C
Textbook Question
Chapter 10.3, Problem 25E
The following set of webpages hyperlinked by the directed graph was studied in Section 10.2, Exercise 25.
Consider randomly surfing on this set of webpages using the Google matrix as the transition matrix.
- a. Show that this Markov chain is irreducible.
- b. Suppose the surfer starts at page 1. How many mouse clicks on average must the surfer make to get back to page 1?
In Exercises 25 and 26, consider a set of webpages hyperlinked by the given directed graph. Find the Google matrix for each graph and compute the PageRank of each page in the set.
25. [The directed graph of hyperlinked webpages is not reproduced here; it is the graph studied in Section 10.2, Exercise 25.]
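The directed graph itself is not reproduced above, so the sketch below uses a hypothetical four-page link structure purely to illustrate the mechanics behind parts (a) and (b): form the Google matrix G = αP + (1 − α)(1/n)E with the commonly used damping factor α = 0.85, observe that every entry of G is positive (so every page is reachable from every other page and the chain is irreducible), and read off the expected number of clicks to return to page 1, since the mean return time to a state equals the reciprocal of its steady-state probability.

```python
import numpy as np

# Hypothetical 4-page link structure -- the actual directed graph from
# Section 10.2, Exercise 25 is not reproduced on this page.
# adj[i, j] = 1 means page j+1 links to page i+1 (column convention,
# matching the text's use of column-stochastic transition matrices).
adj = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

n = adj.shape[0]
alpha = 0.85                      # commonly used PageRank damping factor

# Column-stochastic link matrix; a dangling page (all-zero column)
# would be replaced by the uniform column with entries 1/n.
col_sums = adj.sum(axis=0)
P = np.where(col_sums > 0, adj / np.where(col_sums == 0, 1, col_sums), 1.0 / n)

# Google matrix.  Every entry is strictly positive, so every page can be
# reached from every other page in one step: the chain is irreducible.
G = alpha * P + (1 - alpha) * np.ones((n, n)) / n

# Steady-state (PageRank) vector q: eigenvector of G for eigenvalue 1,
# scaled so that its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(G)
q = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
q = q / q.sum()

# Mean return time to page 1 = 1 / (steady-state probability of page 1).
print("PageRank vector:", np.round(q, 4))
print("Expected clicks to return to page 1:", 1 / q[0])
```

With the actual adjacency structure from Section 10.2, Exercise 25 substituted for `adj`, the same few lines produce the PageRank vector and the answer to part (b).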
Expert Solution & Answer
Want to see the full answer?
Students have asked these similar questions
Each item is inspected and is declared to either pass or fail. The machine can work in automatic or manual mode. If it outputs two failed items in a row in automatic mode, it is switched to manual. Once it produces two passing items in a row in manual mode, it is switched back to automatic. Suppose that the failure rate is a in automatic and b in manual. You modeled the system as a Markov chain with a diagram given below, where states represent the mode and the status of the previously manufactured item; so, for example, state “manual-1 success” represents that the machine is in manual mode and the previous item passed.
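The diagram referred to in the question above is not reproduced here, so the following is only one plausible encoding of the four states it describes (mode plus pass/fail status of the previous item); the failure rates a and b are left as parameters with illustrative values.

```python
import numpy as np

# Illustrative failure rates (the exercise leaves a and b symbolic).
a, b = 0.10, 0.05

# States, in order:
#   0 = automatic, last item passed    1 = automatic, last item failed
#   2 = manual,    last item passed    3 = manual,    last item failed
# Column-stochastic: P[i, j] = probability that inspecting the next item
# moves the machine from state j to state i.  A second failure in a row
# in automatic mode lands in state 3 (switch to manual); a second pass
# in a row in manual mode lands in state 0 (switch back to automatic).
P = np.array([
    #  from: 0      1      2      3
    [1 - a, 1 - a, 1 - b, 0.0  ],   # to 0
    [a,     0.0,   0.0,   0.0  ],   # to 1
    [0.0,   0.0,   0.0,   1 - b],   # to 2
    [0.0,   a,     b,     b    ],   # to 3
])
assert np.allclose(P.sum(axis=0), 1.0)   # each column is a probability vector

# Long-run fraction of time spent in each state.
w, v = np.linalg.eig(P)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()
print(dict(zip(["auto/pass", "auto/fail", "man/pass", "man/fail"], np.round(pi, 4))))
```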
2. A professor either walks or drives to a university. He never drives two days in a row, but if he
walks one day, he is just as likely to walk the next day as to drive his car. Give the transition
matrix for this Markov chain.
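For the walk-or-drive question above, the two stated rules pin down the transition matrix completely. A minimal sketch in the column-stochastic convention the text uses (state 0 = walk, state 1 = drive):

```python
import numpy as np

# Column j gives tomorrow's probabilities given today's choice j.
P = np.array([
    [0.5, 1.0],   # walk tomorrow: 1/2 after walking, certain after driving
    [0.5, 0.0],   # drive tomorrow: 1/2 after walking, never two drives in a row
])
assert np.allclose(P.sum(axis=0), 1.0)
```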
Suppose a math professor collects data on the probability that students attending a given class
meeting will attend the next one. He finds that 95% of students who attended a given class meeting
will attend the following class meeting and that 25% of students who do not attend a given
class meeting will not attend the next one. Build a discrete dynamical system model using linear
algebra. Be sure to state your transition matrix explicitly. What percentage of students does your
model predict will be attending class meetings by the end of the semester (in the long run)?
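For the attendance question above, here is a short sketch of the long-run computation, reading the data as: 95% of attendees attend the next meeting, and 25% of non-attendees again stay away (so 75% of them return).

```python
import numpy as np

# States: 0 = attends, 1 = does not attend (column-stochastic).
P = np.array([
    [0.95, 0.75],
    [0.05, 0.25],
])

# Long-run distribution q: solve P q = q together with q1 + q2 = 1.
A = np.vstack([P - np.eye(2), np.ones((1, 2))])
q, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)
print(np.round(q, 4))   # about [0.9375, 0.0625]
```

Under this reading, the model predicts that roughly 94% of the students attend class meetings in the long run, regardless of the initial split.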
Similar questions
- Explain how you can determine the steady state matrix X of an absorbing Markov chain by inspection.
- Robots have been programmed to traverse the maze shown in Figure 3.28 (not reproduced here) and at each junction randomly choose which way to go. (a) Construct the transition matrix for the Markov chain that models this situation. (b) Suppose we start with 15 robots at each junction. Find the steady state distribution of robots. (Assume that it takes each robot the same amount of time to travel between two adjacent junctions.)
- A state vector X for a three-state Markov chain is such that the system is as likely to be in state 3 as in state 2 and is three times as likely to be in state 1 as in 3. Find the state vector X.
- A garage has 2 mechanics and 5 service bays to house vehicles. Customers arrive on average every 3 hours, and each mechanic works independently, taking an average of 2 hours to complete a service (all times exponentially distributed). Vehicles that cannot find a free service bay leave. A. The garage is modeled as a Markov chain with no distinction between service bays, i.e., the state space only tracks the number of used bays (customers in the garage). Draw the resulting state transition diagram along with the inter-state transition rates. How many states are there in total?
- Consider a Markov random process whose state transition diagram is given (the figure is not reproduced here). Write the state transition matrix for the process. List the pairs of communicating states. List 2 pairs of accessible states and 2 pairs of inaccessible states. List all the transient states. List all the recurrent states. Identify the classes of the Markov chain and list the closed and non-closed classes. Find P[X2 = 2 | X1 = 3]. Find P[X6 = 4, X5 = 3, X4 = 2, X3 = 3, X2 = 2, X1 = 1, X0 = 1], where Xt denotes the state of the random process at time instant t. The initial probability distribution is given by X0 = [2/3 0 0 0 0 0 1/3].
- a) Suppose that whether or not it rains today depends on the weather conditions over the last three days. Show how this system may be analyzed by using a Markov chain. How many states are needed? b) Suppose that if it has rained for the past three days, then it will rain today with probability 0.8; if it did not rain on any of the past three days, then it will rain today with probability 0.2; and in any other case the weather today will, with probability 0.6, be the same as the weather yesterday. Determine P for this Markov chain.
- A car rental company has two locations. Each week, 80% of the cars rented at location A are returned to location A and the rest to location B. Of the cars rented at location B, 30% are returned to location B by the end of the week and the rest to location A. a. Make a transition diagram for this process. b. Write the transition matrix T for this process. c. If 50% of the company's cars start this week at location A and 50% at location B, find the proportion of cars that will be at each location one week later. Label your answers. d. Write and solve a system of equations to find the stable distribution (correct to 3 decimal places) for this Markov process. Show all calculations and label the row operations. (A computational sketch of parts b–d appears after this list.)
- A factory worker will quit with probability 1/2 during her first month, with probability 1/4 during her second month, and with probability 1/8 after that. Whenever someone quits, their replacement starts at the beginning of the next month. Model the status of each position as a Markov chain with 3 states. Identify the states and the transition matrix. Write down the system of equations determining the long-run proportions. Suppose there are 900 workers in the factory. Find the average number of workers who have been there for more than 2 months.
- Modems networked to a mainframe computer system have a limited capacity. … is the probability that a user dials into the network when a modem connection is available, and 1/4 is the probability that a call is received when all lines are busy. The system can be considered as a binary Markov chain. i) Draw the state transition diagram of the Markov chain. ii) Find the state transition matrix and the probability state vector p(k). iii) Describe the steady-state behaviour of the system, i.e., find the steady-state vector. (Hint: for a binary Markov chain in which α and β are the probabilities of leaving states 1 and 2, respectively, p(k) = π + (1 − α − β)^k (p(0) − π), where π = (1/(α + β)) [β, α]^T.)
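Referring back to the car-rental item above, here is a minimal numpy sketch of parts b–d, taking location A as state 0 and location B as state 1:

```python
import numpy as np

# Column-stochastic transition matrix: columns are "rented at A" / "rented at B".
T = np.array([
    [0.80, 0.70],   # returned to A
    [0.20, 0.30],   # returned to B
])

x0 = np.array([0.5, 0.5])                 # even split this week
print("One week later:", T @ x0)          # [0.75, 0.25]

# Stable distribution: solve (T - I) x = 0 together with x1 + x2 = 1.
A = np.vstack([T - np.eye(2), np.ones((1, 2))])
x, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)
print("Stable distribution:", np.round(x, 3))   # about [0.778, 0.222]
```

Starting from an even split, about 75% of the cars are at location A one week later, and the stable distribution is approximately (0.778, 0.222).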