CS 188 Spring 2023 Regular Discussion 9 Solutions

1 Bayes' Nets: Representation and Independence

Parts (a) and (b) pertain to the following Bayes' Net.

[Figure: a Bayes' Net over nodes A, B, C, D, E, F, G. Per the factorization below, the edges are A -> B, A -> C, B -> D, D -> F, E -> F, and D -> G.]

(a) Express the joint probability distribution as a product of terms representing the individual conditional probability tables associated with the Bayes' Net.

P(A) P(C | A) P(B | A) P(D | B) P(E) P(F | D, E) P(G | D)

(b) Assume each node can take on 4 values. How many entries do the factors at A, D, and F have?

A: 4
D: 4^2 = 16
F: 4^3 = 64
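The counts in (b) follow a general rule: a CPT over d-valued variables whose node has k parents holds d^(k+1) entries, one per joint assignment of the node and its parents. A minimal sketch of that rule (the helper name is ours, not from the worksheet):

```python
# Number of entries in a conditional probability table: one entry per joint
# assignment of the node and its parents, so d ** (k + 1) for a d-valued
# node with k d-valued parents.
def cpt_entries(num_parents, domain_size=4):
    return domain_size ** (num_parents + 1)

print(cpt_entries(0))  # A has no parents      -> 4
print(cpt_entries(1))  # D has one parent (B)  -> 16
print(cpt_entries(2))  # F has parents D and E -> 64
```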
Consider the following probability distribution tables. The joint distribution P(A, B, C, D) is equal to the product of these tables.

 A  | P(A)
+a  | 0.8
-a  | 0.2

 A  B | P(B | A)
+a +b | 0.9
+a -b | 0.1
-a +b | 0.6
-a -b | 0.4

 B  C | P(C | B)
+b +c | 0.8
+b -c | 0.2
-b +c | 0.8
-b -c | 0.2

 C  D | P(D | C)
+c +d | 0.25
+c -d | 0.75
-c +d | 0.5
-c -d | 0.5

(c) State all non-conditional independence assumptions that are implied by the probability distribution tables.

From the tables, we have A ⊥̸⊥ B (P(B | A) depends on A) and C ⊥̸⊥ D (P(D | C) depends on C). Since P(C | B) is the same for both values of B, C is independent of B, and hence of A as well; the same carries over to D, which depends only on C. So every remaining pair of variables is independent:

A ⊥⊥ C, A ⊥⊥ D, B ⊥⊥ C, B ⊥⊥ D

You are building advanced safety features for cars that can warn a driver if they are falling asleep (A) and also calculate the probability of a crash (C) in real time. You have at your disposal 6 sensors (random variables):

E: whether the driver's eyes are open or closed
W: whether the steering wheel is being touched or not
L: whether the car is in the lane or not
S: whether the car is speeding or not
H: whether the driver's heart rate is somewhat elevated or resting
R: whether the car radar detects a close object or not

A influences {E, W, H, L, C}. C is influenced by {A, S, L, R}.

(d) Draw the Bayes Net associated with the description above by adding edges between the provided nodes where appropriate.

[Figure: nodes A, S, R, E, W, H, L, C with edges A -> E, A -> W, A -> H, A -> L, A -> C, S -> C, L -> C, and R -> C.]
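The independence claims in (c) can be checked numerically: multiply the four tables into the full joint, then compare each pairwise marginal against the product of the corresponding singleton marginals. A sketch under those assumptions (variable names are ours):

```python
from itertools import product

# The four tables from the worksheet; the joint factors as
# P(A, B, C, D) = P(A) P(B|A) P(C|B) P(D|C).
P_A   = {'+a': 0.8, '-a': 0.2}
P_B_A = {('+a', '+b'): 0.9, ('+a', '-b'): 0.1, ('-a', '+b'): 0.6, ('-a', '-b'): 0.4}
P_C_B = {('+b', '+c'): 0.8, ('+b', '-c'): 0.2, ('-b', '+c'): 0.8, ('-b', '-c'): 0.2}
P_D_C = {('+c', '+d'): 0.25, ('+c', '-d'): 0.75, ('-c', '+d'): 0.5, ('-c', '-d'): 0.5}

joint = {(a, b, c, d): P_A[a] * P_B_A[(a, b)] * P_C_B[(b, c)] * P_D_C[(c, d)]
         for a, b, c, d in product(['+a', '-a'], ['+b', '-b'], ['+c', '-c'], ['+d', '-d'])}

def marginal(idx):
    # Sum the joint down to the variables at positions idx (0=A, 1=B, 2=C, 3=D).
    out = {}
    for assign, p in joint.items():
        key = tuple(assign[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def independent(i, j):
    # X_i and X_j are independent iff P(x_i, x_j) = P(x_i) P(x_j) everywhere.
    pi, pj, pij = marginal((i,)), marginal((j,)), marginal((i, j))
    return all(abs(p - pi[(x,)] * pj[(y,)]) < 1e-12 for (x, y), p in pij.items())

assert not independent(0, 1)  # A and B are dependent
assert not independent(2, 3)  # C and D are dependent
assert independent(0, 2) and independent(0, 3)  # A indep. of C and of D
assert independent(1, 2) and independent(1, 3)  # B indep. of C and of D
```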
2 HMMs

Consider the following Hidden Markov Model. (O1 and O2 are the observed, i.e. shaded, nodes.)

W1 | P(W1)
 0 | 0.3
 1 | 0.7

Wt Wt+1 | P(Wt+1 | Wt)
 0   0  | 0.4
 0   1  | 0.6
 1   0  | 0.8
 1   1  | 0.2

Wt Ot | P(Ot | Wt)
 0  a | 0.9
 0  b | 0.1
 1  a | 0.5
 1  b | 0.5

Suppose that we observe O1 = a and O2 = b. Using the forward algorithm, compute the probability distribution P(W2 | O1 = a, O2 = b) one step at a time.

(a) Compute P(W1, O1 = a).

P(W1, O1 = a) = P(W1) P(O1 = a | W1)
P(W1 = 0, O1 = a) = (0.3)(0.9) = 0.27
P(W1 = 1, O1 = a) = (0.7)(0.5) = 0.35

(b) Using the previous calculation, compute P(W2, O1 = a).

P(W2, O1 = a) = Σ_{w1} P(w1, O1 = a) P(W2 | w1)
P(W2 = 0, O1 = a) = (0.27)(0.4) + (0.35)(0.8) = 0.388
P(W2 = 1, O1 = a) = (0.27)(0.6) + (0.35)(0.2) = 0.232

(c) Using the previous calculation, compute P(W2, O1 = a, O2 = b).

P(W2, O1 = a, O2 = b) = P(W2, O1 = a) P(O2 = b | W2)
P(W2 = 0, O1 = a, O2 = b) = (0.388)(0.1) = 0.0388
P(W2 = 1, O1 = a, O2 = b) = (0.232)(0.5) = 0.116

(d) Finally, compute P(W2 | O1 = a, O2 = b).

Renormalizing the distribution above, we have

P(W2 = 0 | O1 = a, O2 = b) = 0.0388 / (0.0388 + 0.116) ≈ 0.25
P(W2 = 1 | O1 = a, O2 = b) = 0.116 / (0.0388 + 0.116) ≈ 0.75
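Steps (a) through (d) are one pass of the forward algorithm: multiply in the emission, push through the transition, multiply in the next emission, then normalize. A compact sketch for this chain (the function name and dictionary layout are ours):

```python
# Forward algorithm for the HMM above. alpha_t(w) = P(W_t = w, o_1, ..., o_t).
prior = {0: 0.3, 1: 0.7}                                         # P(W1)
trans = {(0, 0): 0.4, (0, 1): 0.6, (1, 0): 0.8, (1, 1): 0.2}     # P(W_{t+1} | W_t)
emit  = {(0, 'a'): 0.9, (0, 'b'): 0.1, (1, 'a'): 0.5, (1, 'b'): 0.5}  # P(O_t | W_t)

def forward(obs):
    # Step (a): fold the first observation into the prior.
    alpha = {w: prior[w] * emit[(w, obs[0])] for w in (0, 1)}
    for o in obs[1:]:
        # Steps (b) and (c): time update (sum over w1), then observation update.
        alpha = {w2: sum(alpha[w1] * trans[(w1, w2)] for w1 in (0, 1)) * emit[(w2, o)]
                 for w2 in (0, 1)}
    # Step (d): renormalize to get the posterior over the last state.
    z = sum(alpha.values())
    return alpha, {w: alpha[w] / z for w in (0, 1)}

alpha, posterior = forward(['a', 'b'])
print(alpha)      # unnormalized: {0: 0.0388, 1: 0.116} up to float rounding
print(posterior)  # roughly {0: 0.25, 1: 0.75}
```

Because the normalizer cancels, only the final division in step (d) is needed to recover the conditional distribution from the unnormalized forward messages.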