First Paradox: Under certain circumstances, you have your best chance of winning a tennis
tournament if you play most of your games against the best possible opponent.
Alice and her two sisters, Betty and Carol, are avid tennis players. Betty is the best of the three sisters, and Carol plays at the same level as Alice. Alice defeats Carol 50% of the time but only defeats Betty 40% of the time.
Alice’s mother offers to give her $100 if she can win two consecutive games when playing three alternating games against her two sisters. Since the games will alternate, Alice has two possibilities for the sequence of opponents. One possibility is to play the first game against Betty, followed by a game with Carol, and then another game with Betty. We will refer to this sequence as BCB. The other possible sequence is CBC.
Calculate the probability of Alice getting the $100 reward if she chooses the sequence CBC.
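One way to check the computation (and to see the paradox in action) is to enumerate all eight win/loss outcomes of the three games. This is a quick sketch, not the textbook's solution; it uses only the win probabilities stated above (50% against Carol, 40% against Betty):

```python
from itertools import product

P_BEAT = {"B": 0.4, "C": 0.5}  # Alice's win probability against each sister

def p_two_consecutive(sequence):
    """Probability Alice wins two consecutive games in the given opponent sequence."""
    total = 0.0
    for outcome in product([True, False], repeat=3):  # win/loss for each of the 3 games
        if not any(outcome[i] and outcome[i + 1] for i in range(2)):
            continue  # this outcome has no two consecutive wins
        p = 1.0
        for opp, won in zip(sequence, outcome):
            p *= P_BEAT[opp] if won else 1 - P_BEAT[opp]
        total += p
    return total

print(round(p_two_consecutive("CBC"), 4))  # 0.3
print(round(p_two_consecutive("BCB"), 4))  # 0.32
```

With these numbers BCB, where Alice plays the stronger sister twice, gives her a better chance than CBC, which is exactly the paradox stated at the top.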
Chapter 6 Solutions
Finite Mathematics & Its Applications (12th Edition)
- Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 60% of their games. To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage. It turns out that with a five-point advantage Bob wins 60% of the time; he wins 60% of the time with a ten-point advantage and 60% of the time with a fifteen-point advantage. Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system, the steady-state vector for the system, and determine the proportion of games that Doug will win in the long run.
- Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 60% of their games. To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage. It turns out that with a five-point advantage Bob wins 70% of the time; he wins 80% of the time with a ten-point advantage and 90% of the time with a fifteen-point advantage. Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system, the steady-state vector for the system, and determine the proportion of games that Doug will win in the long run.
- Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 60% of their games. To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage. It turns out that with a five-point advantage Bob wins 40% of the time; he wins 70% of the time with a ten-point advantage and 70% of the time with a fifteen-point advantage. Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system, the steady-state vector for the system, and determine the proportion of games that Doug will win in the long run.
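All of the Ping-Pong variants above share one structure; only the win percentages change. For the first variant (Doug wins 60% of even games, and Bob wins 60% of the time with a five-, ten-, or fifteen-point spot), the chain can be set up and solved by power iteration in plain Python. A sketch, not a definitive solution:

```python
# States: number of consecutive games Doug has won (0, 1, 2, 3+).
# doug_win[s] = probability Doug wins the next game from state s.
# Numbers from the first variant: Bob wins 60% with any point spot.
doug_win = [0.6, 0.4, 0.4, 0.4]

# Transition matrix P[i][j] = probability of moving from state i to state j.
P = [[0.0] * 4 for _ in range(4)]
for s in range(4):
    P[s][min(s + 1, 3)] += doug_win[s]  # Doug wins: consecutive count grows (capped at 3+)
    P[s][0] += 1 - doug_win[s]          # Bob wins: back to no advantage

# Steady-state vector by power iteration.
pi = [0.25] * 4
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

doug_share = sum(pi[s] * doug_win[s] for s in range(4))
print([round(x, 4) for x in pi])  # [0.5, 0.3, 0.12, 0.08]
print(round(doug_share, 4))       # 0.5
```

With these particular numbers the handicap scheme exactly levels the match: Doug wins half the games in the long run. Swapping in the other variants' percentages for `doug_win` answers the remaining versions the same way.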
- Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 80% of their games. To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage. It turns out that with a five-point advantage Bob wins 40% of the time; he wins 70% of the time with a ten-point advantage and 90% of the time with a fifteen-point advantage. Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system, the steady-state vector for the system, and determine the proportion of games that Doug will win in the long run.
- Bob and Doug play a lot of Ping-Pong, but Doug is a much better player, and wins 80% of their games. To make up for this, if Doug wins a game he will spot Bob five points in their next game. If Doug wins again he will spot Bob ten points the next game, and if he still wins the next game he will spot him fifteen points, and continue to spot him fifteen points as long as he keeps winning. Whenever Bob wins a game he goes back to playing the next game with no advantage. It turns out that with a five-point advantage Bob wins 20% of the time; he wins 50% of the time with a ten-point advantage and 80% of the time with a fifteen-point advantage. Model this situation as a Markov chain using the number of consecutive games won by Doug as the states. There should be four states representing zero, one, two, and three or more consecutive games won by Doug. Find the transition matrix of this system, the steady-state vector for the system, and determine the proportion of games that Doug will win in the long run.
- Suppose Janice has a 25% chance of totaling her car (worth $13,500) this year and Sam has a 19% chance of totaling his car ($23,700) this year. If you run a car insurance company and want to offer an insurance policy to these two customers at the same price, what price should you charge (without any profit mark-up)?
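For the insurance question above, the break-even price is the total expected payout split evenly over the two identical-priced policies. A quick check of the arithmetic:

```python
# Expected payout = sum of (probability of total loss) x (car value) for each customer.
expected_payout = 0.25 * 13_500 + 0.19 * 23_700  # Janice + Sam
price_each = expected_payout / 2                  # same price charged to both
print(round(price_each, 2))  # about $3,939 per policy
```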
- A ticket between Corellia and Dantooine (located in a galaxy far far away) sells for $150. The plane can hold 100 people. It costs $8000 to fly an empty plane. Each person on the plane incurs variable costs of $30 (for food and fuel). If the flight is overbooked, anyone who cannot get a seat receives $300 in compensation. On average, 95% of all people who have a reservation show up for the flight. To maximize profit, how many reservations for the flight should you book? Hint: You can assume that the number of people that show up for a flight follows a binomial random variable.
- Suppose Kim is deciding whether to open a gym or a diner. Her friend Tim, who is a finance expert, advised that there is a 45% chance that the gym will be successful and return a payoff of $55000 and a 55% chance that it will fail and lose $9500. Tim also advised that there is an 80% chance that the diner will be successful with a payoff of $30000 and a 20% chance of failure with a loss of $6000. 13) What is the expected payoff for the gym? a) 18000 b) 16525 c) 19555 d) 19525
- Janus Seagull had a car accident and was out of work for a year. Janus believes that the accident was caused by a vehicle defect. He consulted some lawyers and planned to sue the vehicle manufacturer. During negotiations, the legal team of the vehicle manufacturer offered a $700K settlement. However, Janus needs to settle the $100K in legal fees. Janus asked his lawyer for advice and the lawyer told him that he has a 50% chance of winning the case. If Janus loses, he will incur legal fees of $75K. If he wins, the full requested settlement is also not guaranteed. The lawyer believes that there is a 50% chance that Janus will receive full settlement of $3 million, of which Janus needs to settle $1 million in legal fees. There is a 50% chance that the jury will award Janus $1 million, of which 50% will be taken up by legal fees.
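The overbooking question above can be attacked numerically: if n reservations are sold and each holder shows up independently with probability 0.95, the number of shows S is Binomial(n, 0.95), and expected profit is ticket revenue minus the fixed flying cost, variable costs for seated passengers, and compensation for bumped ones. A sketch under the assumption that every reservation sold brings in the full $150 (the problem does not say whether bumped passengers are also refunded):

```python
from math import comb

def expected_profit(n, p=0.95, seats=100):
    """Expected profit from selling n reservations at $150 each."""
    total = 0.0
    for s in range(n + 1):  # s reservation-holders show up
        prob = comb(n, s) * p**s * (1 - p)**(n - s)
        seated = min(s, seats)
        bumped = max(s - seats, 0)
        total += prob * (150 * n - 8000 - 30 * seated - 300 * bumped)
    return total

# Scan a reasonable range and take the most profitable booking level.
best = max(range(100, 121), key=expected_profit)
print(best, round(expected_profit(best), 2))
```

Evaluating `expected_profit` over a range and taking the argmax answers the question under this model; a different refund assumption shifts both the profit values and the optimum.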
- The table below states the payoffs in political points (measured in billions of rubles) to two nations that are rivals in world politics, Russia and Ukraine. Each country can take one of two courses: peace, or war. In each cell, the first payoff is for Russia, and the second payoff is for Ukraine. (a) Assume that neither country observes the military strategy of its rival, and solve the game (if it can be solved). Explain your solution step-by-step. Does this outcome maximize total political points? (b) In general, what is a Nash equilibrium? Is the solution to this game a Nash equilibrium? (c) Suppose that each country deposits a fund of two billion rubles with the United Nations. Either country would forfeit this fund if it wages war. What is the solution now to the game? Is this a Nash equilibrium?
                  Ukraine: Peace    Ukraine: War
  Russia: Peace       4, 4             1, 5
  Russia: War         5, 1             2, 2
- A friend who lives in Los Angeles makes frequent consulting trips to Washington, D.C.; 50% of the time she travels on airline #1, 30% of the time on airline #2, and the remaining 20% of the time on airline #3. For airline #1, flights are late into D.C. 30% of the time and late into L.A. 10% of the time. For airline #2, these percentages are 25% and 20%, whereas for airline #3 the percentages are 40% and 25%. If we learn that on a particular trip she arrived late at exactly one of the two destinations, what are the posterior probabilities of having flown on airlines #1, #2, and #3? Assume that the chance of a late arrival in L.A. is unaffected by what happens on the flight to D.C. [Hint: From the tip of each first-generation branch on a tree diagram, draw three second-generation branches labeled, respectively, 0 late, 1 late, and 2 late.]
- While walking around you see a carnival game! The game has a 1% chance of you winning $150, a 3% chance of you winning $20, a 6% chance of you winning $6, and a 90% chance of you winning nothing. If the game costs $3 to play, how much should you expect to win (or lose) if you play this game once? For the same situation, how much should you expect to win (or lose) if you play the game 300 times?
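For the carnival game that ends the list, the expected value per play is the probability-weighted sum of the prizes minus the $3 entry fee, and the 300-play expectation is just 300 times that:

```python
payoffs = [150, 20, 6, 0]
probs = [0.01, 0.03, 0.06, 0.90]

ev_per_play = sum(p * w for p, w in zip(probs, payoffs)) - 3
print(round(ev_per_play, 2))        # -0.54  (expect to lose 54 cents per play)
print(round(300 * ev_per_play, 2))  # -162.0 (expected loss over 300 plays)
```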