CS 188 Summer 2023 Discussion 4B

1 Particle Filtering

Let's use particle filtering to estimate the distribution P(W2 | O1 = a, O2 = b). Here's the HMM again.

Initial distribution:

    W1  P(W1)
    0   0.3
    1   0.7

Transition model:

    Wt  Wt+1  P(Wt+1 | Wt)
    0   0     0.4
    0   1     0.6
    1   0     0.8
    1   1     0.2

Sensor model:

    Wt  Ot  P(Ot | Wt)
    0   a   0.9
    0   b   0.1
    1   a   0.5
    1   b   0.5

We start with two particles representing our distribution for W1:

    P1: W1 = 0
    P2: W1 = 1

Use the following random numbers to run particle filtering:

    [0.22, 0.05, 0.33, 0.20, 0.84, 0.54, 0.79, 0.66, 0.14, 0.96]

(a) Observe: compute the weight of the two particles after evidence O1 = a.

(b) Resample: using the random numbers, resample P1 and P2 based on the weights.

(c) Predict: sample P1 and P2 by applying the time update.

(d) Update: compute the weight of the two particles after evidence O2 = b.

(e) Resample: using the random numbers, resample P1 and P2 based on the weights.

(f) What is our estimated distribution for P(W2 | O1 = a, O2 = b)?

2 MangoBot Human Detector

Your startup company MangoBot wants to build robots that deliver packages on the road. One core module of the robot's software detects whether a human is standing in front of it. We model the presence of humans with a Markov model:

    H0 -> H1 -> H2 -> ...    (each arc carries the transition model P(Ht+1 | Ht))

where Ht ∈ {0, 1} corresponds to a human being absent or present, respectively. The initial distribution and the transition probabilities are given as follows:
Initial distribution:

    H0  P(H0)
    0   p
    1   1 − p

Transition model:

    Ht  Ht+1  P(Ht+1 | Ht)
    0   0     0.9
    0   1     0.1
    1   0     0.8
    1   1     0.2

(a) Express the following quantities in terms of p:

    (i) P(H1 = 1) =

    (ii) lim t→∞ P(Ht = 0) =

To make things simple, we stick to the original first-order Markov chain formulation. To make the detection more accurate, the company built a sensor that returns an observation Ot at each time step as a noisy measurement of the unknown Ht. The new model is illustrated below, and the relationship between Ht and Ot is given in the table that follows.

    H0 -> H1 -> H2 -> ...    (transitions P(Ht+1 | Ht))
     |     |     |
     o0    o1    o2

Sensor model:

    Ht  Ot  P(Ot | Ht)
    0   0   0.8
    0   1   0.2
    1   0   0.3
    1   1   0.7

(b) Based on the observed sensor values o0, o1, ..., ot, we now want the robot to find the most likely sequence H0, H1, ..., Ht indicating the presence/absence of a human up to the current time.

    (i) Suppose that [o0, o1, o2] = [0, 1, 1] is observed. The "trellis diagram" for this model shows the possible state transitions. Fill in the values for the arcs labeled A, B, C, and D with the product of the transition probability and the observation likelihood for the destination state. The values may depend on p.

    [Trellis diagram: Start branches to H0 = 0 and H0 = 1; each Ht = 0/1 node has arcs to Ht+1 = 0 and Ht+1 = 1, four of which are labeled A, B, C, and D.]

    (ii) There are two possible most likely state sequences, depending on the value of p. Complete the following (write the sequence as "x,y,z" (without quotes), where x, y, z are either 0 or 1). Hint: it might be helpful to complete the labelling of the trellis diagram above.

    When p < ____, the most likely sequence H0, H1, H2 is ____. Otherwise, the most likely sequence H0, H1, H2 is ____.

(c) True or False: for a fixed p value and observations {o0, o1, o2} in general, the most likely value for H1 alone is always the same as the value of H1 in the most likely sequence H0, H1, H2.

    ( ) True    ( ) False
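The observe/resample/predict loop of Problem 1 can be sketched in code. This is a minimal illustration, not the official solution: it assumes the common convention that sampling splits [0, 1) into consecutive sub-intervals in the order the outcomes are listed, with the ten random numbers consumed left to right (a different interval ordering could resample differently).

```python
# Sketch of one particle-filtering pass for the Problem 1 HMM.
# Assumed convention: split [0, 1) into intervals in listed-outcome
# order and pick the interval containing the next random number.
TRANS = {0: {0: 0.4, 1: 0.6}, 1: {0: 0.8, 1: 0.2}}          # P(W_{t+1} | W_t)
OBS   = {0: {'a': 0.9, 'b': 0.1}, 1: {'a': 0.5, 'b': 0.5}}  # P(O_t | W_t)

randoms = iter([0.22, 0.05, 0.33, 0.20, 0.84, 0.54, 0.79, 0.66, 0.14, 0.96])

def sample(dist):
    """Draw from {outcome: prob} using the next pre-supplied random number."""
    r, cum = next(randoms), 0.0
    for outcome, prob in dist.items():
        cum += prob
        if r < cum:
            return outcome
    return outcome  # guard against float rounding in the last interval

def resample(particles, weights):
    """Normalize the weights into one distribution, then redraw each particle."""
    total = sum(weights)
    weighted = {0: 0.0, 1: 0.0}
    for w, wt in zip(particles, weights):
        weighted[w] += wt / total
    return [sample(weighted) for _ in particles]

particles = [0, 1]                                   # P1: W1 = 0, P2: W1 = 1

weights = [OBS[w]['a'] for w in particles]           # (a) observe O1 = a
particles = resample(particles, weights)             # (b) resample
particles = [sample(TRANS[w]) for w in particles]    # (c) predict (time update)
weights = [OBS[w]['b'] for w in particles]           # (d) update with O2 = b
particles = resample(particles, weights)             # (e) resample

# (f) The particle counts estimate P(W2 | O1 = a, O2 = b).
estimate = {w: particles.count(w) / len(particles) for w in (0, 1)}
print(estimate)
```

With only two particles the estimate is coarse; in practice many particles are used and the counts approximate the posterior.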
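For Problem 2, the two sub-questions can be probed numerically. The sketch below is my own illustration (function names and the brute-force approach are not from the worksheet): with only 2^3 = 8 candidate sequences, exhaustively scoring each joint probability is equivalent to running Viterbi, and repeatedly applying the transition model approximates the limiting distribution asked for in (a)(ii).

```python
from itertools import product

# Brute-force "Viterbi" over all 8 state sequences for obs = [0, 1, 1],
# plus a power-iteration check of lim_t P(H_t = 0).
TRANS = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.8, 1: 0.2}}  # P(H_{t+1} | H_t)
OBS   = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}  # P(O_t | H_t)
obs = [0, 1, 1]                                     # observed o0, o1, o2

def best_sequence(p):
    """Most likely (H0, H1, H2) given the observations and P(H0 = 0) = p."""
    prior = {0: p, 1: 1 - p}
    def joint(seq):
        prob = prior[seq[0]] * OBS[seq[0]][obs[0]]
        for t in (1, 2):
            prob *= TRANS[seq[t - 1]][seq[t]] * OBS[seq[t]][obs[t]]
        return prob
    return max(product((0, 1), repeat=3), key=joint)

def stationary(steps=200):
    """Iterate the transition model; the limit is independent of the start."""
    dist = {0: 0.5, 1: 0.5}
    for _ in range(steps):
        dist = {j: sum(dist[i] * TRANS[i][j] for i in (0, 1)) for j in (0, 1)}
    return dist

# Trying a small and a large p exposes the two regimes of (b)(ii);
# stationary() approximates the limit in (a)(ii).
print(best_sequence(0.1), best_sequence(0.9), stationary()[0])
```

Sweeping p between the two probed values locates the threshold where the most likely sequence flips, which is exactly what part (b)(ii) asks you to derive by hand from the trellis.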