Finite Mathematics for the Managerial, Life, and Social Sciences
12th Edition
ISBN: 9781337405782
Author: Soo T. Tan
Publisher: Cengage Learning
Textbook Question
Chapter 9.1, Problem 29E
In Exercises 29 and 30, determine whether the statement is true or false:
A Markov chain is a process in which the outcomes at any stage of the experiment depend on the outcomes of the preceding stages.
Students have asked these similar questions
Can you please do parts (d) and (e)? Please provide explanations.
Consider a game where you need to fill a 5-digit number by spinning a wheel. Every time you spin the wheel, a random digit between 0 and 9 shows up with equal probability. You are to decide where to place the digit in your number. Once you choose a location for the digit, you cannot change it. Your objective is to maximize the number you get.
(a) Set this up as a Markov Decision Process, identify the states, actions, and transition matrices.
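For part (a), a minimal dynamic-programming sketch (my own formulation, not necessarily the intended textbook setup): since the final number is additive in digit × place value, the optimal expected value depends only on which positions remain empty. So the state can be taken as the set of open positions, the action as the slot chosen for the digit just spun, and the transition as a uniform draw of the next digit.

```python
from functools import lru_cache

# Expected value of the 5-digit spinning-wheel game under optimal play.
# Key simplification (an assumption of this sketch): the final number is
# additive in digit * 10**position, so the optimal expected value depends
# only on the SET of still-empty positions, not on digits already placed.

DIGITS = range(10)  # each spin shows a digit 0-9 with equal probability

@lru_cache(maxsize=None)
def value(open_positions: frozenset) -> float:
    """Expected total contribution of the slots in open_positions
    under the optimal placement policy."""
    if not open_positions:
        return 0.0
    total = 0.0
    for d in DIGITS:  # the wheel shows digit d with probability 1/10
        # Action: choose the empty position that maximizes the
        # immediate contribution plus the expected value of the rest.
        total += max(d * 10**p + value(open_positions - {p})
                     for p in open_positions)
    return total / len(DIGITS)

if __name__ == "__main__":
    print(value(frozenset({0})))       # single slot: 4.5
    print(value(frozenset({0, 1})))    # two slots: 60.75
    print(value(frozenset(range(5))))  # full 5-digit game
```

For comparison, placing digits at random gives an expected value of 4.5 × 11111 = 49999.5, so the optimal value for the full game must exceed that.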
Anne and Barry take turns rolling a pair of dice, with Anne going first. Anne's goal is to obtain a sum of 3, while Barry's goal is to obtain a sum of 4. The game ends when either player reaches their goal, and the one reaching the goal is the winner.
Define a Markov chain to model the problem.
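One possible chain (the state names and ordering are my choice, not the textbook's): two transient states for whose turn it is, and two absorbing states for the winner. The per-roll success probabilities are P(sum = 3) = 2/36 and P(sum = 4) = 3/36. The sketch below finds the absorption probabilities by iterating the chain:

```python
# A sketch of one possible chain for the Anne/Barry dice game.
# States: 0 = Anne about to roll, 1 = Barry about to roll,
#         2 = Anne has won (absorbing), 3 = Barry has won (absorbing).
P_SUM3 = 2 / 36   # ways to roll a sum of 3: (1,2), (2,1)
P_SUM4 = 3 / 36   # ways to roll a sum of 4: (1,3), (2,2), (3,1)

T = [
    [0.0,        1 - P_SUM3, P_SUM3, 0.0],     # Anne rolls
    [1 - P_SUM4, 0.0,        0.0,    P_SUM4],  # Barry rolls
    [0.0,        0.0,        1.0,    0.0],     # absorbing: Anne wins
    [0.0,        0.0,        0.0,    1.0],     # absorbing: Barry wins
]

def step(dist, T):
    """One step of the chain: row vector times transition matrix."""
    n = len(T)
    return [sum(dist[i] * T[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0, 0.0]  # Anne goes first
for _ in range(2000):        # iterate until the mass is absorbed
    dist = step(dist, T)

print(dist[2], dist[3])  # -> approximately 12/29 and 17/29
```

Solving the absorption equations by hand confirms this: with p = 1/18 and q = 1/12, Anne's win probability a satisfies a = p + (1 - p)(1 - q)a, giving a = 12/29.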
Similar questions
- Draw the state diagram for the Markov model and show the transition probabilities on the diagram.
- Nick takes half-court shots on a basketball court. He is a streaky shooter, so his shot outcomes are not independent. If Nick made his last shot, then he makes his current one with probability a. If Nick missed his last shot, then he makes his current one with probability b, where b < a. Modeling Nick's sequence of half-court shot outcomes as a Markov chain, what is the long-run probability that he makes a half-court shot?
- I. Markov Chains. A Markov chain (or process) is one in which future outcomes are determined by a current state. Future outcomes are based on probabilities: the probability of moving to a certain state depends only on the state previously occupied and does not vary with time. An example of a Markov chain is the maximum education achieved by children based on the highest educational level attained by their parents, where the states are (1) earned college degree, (2) high school diploma only, (3) elementary school only. If p_ij is the probability of moving from state i to state j, the transition matrix is the m x m matrix

      P = | p_11  p_12  ...  p_1m |
          | ...   ...   ...  ...  |
          | p_m1  p_m2  ...  p_mm |
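For Nick's two-state chain (states "make" and "miss"), the stationary equation pi = pi*a + (1 - pi)*b gives the closed form pi = b / (1 - a + b). A small sketch checking the formula against direct iteration, using hypothetical parameter values a = 0.6, b = 0.3 (my numbers, not from the question):

```python
def long_run_make_prob(a: float, b: float) -> float:
    """Stationary probability of 'make' for the two-state chain with
    P(make -> make) = a and P(miss -> make) = b.
    Solving pi = pi*a + (1 - pi)*b yields pi = b / (1 - a + b)."""
    return b / (1 - a + b)

def iterate(a: float, b: float, steps: int = 10_000) -> float:
    """Power-iterate the make-probability as a cross-check."""
    pi = 0.5  # arbitrary starting distribution
    for _ in range(steps):
        pi = pi * a + (1 - pi) * b
    return pi

a, b = 0.6, 0.3  # hypothetical streaky-shooter parameters (b < a)
print(long_run_make_prob(a, b))  # 0.3 / 0.7, about 0.4286
```

The iteration is a contraction with factor |a - b|, so it converges to the same fixed point for any starting value.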
- Shakira's concerts behave like a Markov chain. If the current concert gets cancelled, then there is a 90% chance that the next concert will be cancelled also. However, if the current concert does not get cancelled, then there is only a 50% chance that the next concert will be cancelled. What is the long-run probability that a concert will not be cancelled?
  a. 1/4  b. 1/10  c. 1/6  d. 1/2  e. 5/6  f. None of the others are correct
- Shakira's concerts behave like a Markov chain. If the current concert gets cancelled, then there is an 80% chance that the next concert will be cancelled also. However, if the current concert does not get cancelled, then there is only a 60% chance that the next concert will be cancelled. What is the long-run probability that a concert will be cancelled?
  1/4, None of the others are correct, 3/4, 2/3, 4/5, 7/10
- I need help solving part (b), please. The computer center at Rockbottom University has been experiencing computer downtime. Let us assume that the trials of an associated Markov process are defined as one-hour periods and that the probability of the system being in a running state or a down state is based on the state of the system in the previous period. Historical data show the following transition probabilities:

      From \ To   Running   Down
      Running     0.90      0.10
      Down        0.20      0.80

  (a) If the system is initially running, what is the probability of the system being down in the next hour of operation? (The answer for part (a) is 0.10.)
  (b) What are the steady-state probabilities of the system being in the running state and in the down state? (Enter your probabilities as fractions.) Running: pi_1 = ?  Down: pi_2 = ?
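The same two-state stationary formula answers both the concert and the downtime questions. A sketch (the helper name `steady_state` is mine) applied to the Rockbottom numbers (0.90/0.10, 0.20/0.80) and the 80%/60% concert version:

```python
def steady_state(p_stay_1: float, p_enter_1: float):
    """Stationary distribution of a two-state chain where state 1 is
    kept with probability p_stay_1 and entered from state 2 with
    probability p_enter_1: pi_1 = p_enter_1 / (1 - p_stay_1 + p_enter_1)."""
    pi1 = p_enter_1 / (1 - p_stay_1 + p_enter_1)
    return pi1, 1 - pi1

# Rockbottom computer: running stays running w.p. 0.9; down -> running w.p. 0.2
pi_running, pi_down = steady_state(0.9, 0.2)
print(pi_running, pi_down)  # 2/3 and 1/3

# Shakira (80/60 version): cancelled stays cancelled w.p. 0.8;
# not-cancelled -> cancelled w.p. 0.6
pi_cancelled, _ = steady_state(0.8, 0.6)
print(pi_cancelled)  # 0.75, i.e. long-run cancelled probability 3/4
```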
- A study of pine nut crops in the American Southwest from 1940 to 1947 hypothesized that nut production followed a Markov chain. The data suggested that if one year's crop was good, then the probabilities that the following year's crop would be good, fair, or poor were 0.08, 0.07, and 0.85, respectively; if one year's crop was fair, the corresponding probabilities were 0.09, 0.11, and 0.80; and if one year's crop was poor, they were 0.11, 0.05, and 0.84. (a) Write down the transition matrix for this Markov chain. (b) If the pine nut crop was good in 1940, find the probabilities of a good crop in the years 1941 through 1945. (c) In the long run, what proportion of the crops will be good, fair, and poor?
- At Community College, 10% of all business majors switched to another major the next semester, while the remaining 90% continued as business majors. Of all nonbusiness majors, 20% switched to a business major the following semester, while the rest did not. Set up these data as a Markov transition matrix, and calculate the probability that a business major will no longer be a business major in the long run.
- A system consists of five components, each of which can be operational or not. Each day one operational component is used, and it will fail with probability 20%. Any time there are no operational components at the end of a day, maintenance is performed and all non-operational components are repaired (with probability 1). The system does not perform any other tasks on the day of repairs. Model the system as a Markov chain. Write down the equations for determining the long-run proportions. Suppose that you are interested in the average number of days that the system is under repair; explain how you would find it using your model.
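A sketch for the pine-nut question, using the transition probabilities given above: five steps of the chain answer part (b), and continuing the iteration until the distribution stops changing approximates the long-run proportions for part (c).

```python
# Power iteration on the pine-nut transition matrix.
# States (in order): good, fair, poor; rows are "from", columns are "to".
T = [
    [0.08, 0.07, 0.85],  # from a good year
    [0.09, 0.11, 0.80],  # from a fair year
    [0.11, 0.05, 0.84],  # from a poor year
]

def step(dist, T):
    """One step of the chain: row vector times transition matrix."""
    return [sum(dist[i] * T[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]  # the 1940 crop was good
for year in range(1941, 1946):  # part (b): 1941 through 1945
    dist = step(dist, T)
    print(year, dist[0])  # dist[0] is P(good crop) that year

# Part (c): keep iterating until the distribution is (numerically) stationary.
for _ in range(1000):
    dist = step(dist, T)
print(dist)  # approximate long-run proportions (good, fair, poor)
```

Since every row puts most of its mass on "poor", the stationary distribution is dominated by the poor state.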
- A factory worker will quit with probability 1/2 during her first month, with probability 1/4 during her second month, and with probability 1/8 each month after that. Whenever someone quits, their replacement will start at the beginning of the next month and follow the same pattern. Model this position's status as a Markov chain. What is the long-run probability of having a new employee in a given month? Please provide steps and explanations for the answers.
- Topic: Markov chains. In a sample of 400 Internet subscribers taken in late 2000, 80% were connected by telephone and the rest via cable modem. At the end of 2001, the number of subscribers who had switched from telephone to cable-modem connection was 110, and the number who had switched from cable modem to telephone connection was 24. (b) What proportion of subscribers will each system have by the end of 2002?