Question

Please provide working code in Python that solves Question 2. Please note that I am not looking for an explanation, only code.

 

Viterbi Algorithm
While calculating the probability of data under an HMM as done above is straightforward, we typically do not know the hidden states. For example, it was easy
to calculate the probability of a sequence given two Markov models, one for generating sequence within a CpG island and the other for generating sequence
outside a CpG island. HMMs are particularly useful when we do not know whether a coin is Fair or Loaded, or whether a sequence is within a CpG island or
not.
The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states (called the Viterbi path) that results in a
sequence of observed events in the context of a hidden Markov model. The Viterbi algorithm is known as a decoding algorithm.
To find the most likely sequence of hidden states, the Viterbi algorithm uses dynamic programming to iteratively calculate the score of the most likely path up to a
certain step (i), using the scores already computed for the prior step (i-1). In assessing the most likely path, it is exhaustive.
Sk,i = score of the most likely path up to step i, with the hidden state at step i equal to k
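As a minimal sketch, the recursion and traceback described above can be written in Python. The function name `viterbi` and the dictionary layout for the initial, transition (A), and emission (E) probabilities are illustrative assumptions, not part of the original text:

```python
# Minimal Viterbi sketch. `init`, `A`, and `E` are dicts keyed by state;
# returns the score table S (S[i][k] = best path score ending in state k
# at position i) and the most likely state path via traceback.
def viterbi(obs, states, init, A, E):
    # Initialization: S[k,1] = I x E for the first observation.
    S = [{k: init[k] * E[k][obs[0]] for k in states}]
    back = [{}]
    for i in range(1, len(obs)):
        S.append({})
        back.append({})
        for k in states:
            # Best prior state j maximising S[j,i-1] * P(k | j).
            j_best = max(states, key=lambda j: S[i - 1][j] * A[j][k])
            S[i][k] = E[k][obs[i]] * S[i - 1][j_best] * A[j_best][k]
            back[i][k] = j_best  # record transition for traceback
    # Traceback from the best final state.
    last = max(states, key=lambda k: S[-1][k])
    path = [last]
    for i in range(len(obs) - 1, 0, -1):
        path.append(back[i][path[-1]])
    return S, path[::-1]
```

Note that `back` records, at each step, which prior state won the max; the traceback then walks these pointers backwards, just as the text describes for the Needleman-Wunsch-style backtrace.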
[Figure: trellis diagram of hidden states (F, L) and observed tosses (H, T) at positions 1 through n, with arrows showing the possible transitions between states at successive positions, e.g. the paths leading to SFair,3.]
To calculate the most likely path out of all possible paths, at each transition the Viterbi asks which path is most likely. So if we want to calculate the probability
of a Fair coin at position i in the sequence of tosses, we need to consider two possibilities: the coin was fair in the prior state or the coin was loaded in the prior
state.
SFair,i = P(xi | Fair) × max(SFair,i-1 × P(Fair | Fair), SLoaded,i-1 × P(Fair | Loaded))

Here xi is the observation at position i; for example, if xi = Heads, the emission factor is P(Heads | Fair).
Let's calculate the probability of the most likely path. This involves calculating each Sk,i, where k is the current state and i is the current position in the
sequence. Similar to the Needleman-Wunsch algorithm, we also need to keep track of which transition was most likely at each step so that we can use a
backtrace to finally recover the most likely path.
Consider the transition (A) and emission (E) matrices below and the observation 'THTH'.
What is Sk,i?
A:
       F     L
  F   0.6   0.4
  L   0.4   0.6

E:
       H     T
  F   0.5   0.5
  L   0.8   0.2

Let's start with initial probabilities I of states F, L = [0.5, 0.5]. Initial state probabilities are needed, and we'll arbitrarily assume equal chances for each. The first observation of 'THTH' is T, so:

SF,1 = I × E = 0.5 × 0.5
SL,1 = I × E = 0.5 × 0.2

Once we have these initial values of Sk,1 we can use the recursion formula for each subsequent position in the sequence of 'THTH' observations:

Sk,i = E × max(Sk,i-1 × A)

SF,2 = P(H|F) × max(Sk,1 × P(F|k))

The maximum is of these two possibilities:

max(SF,1 × P(F|F), SL,1 × P(F|L))

Filling in with numbers we have:

max(0.5 × 0.5 × 0.6, 0.5 × 0.2 × 0.4)

The max is for a fair coin in the prior state = 0.5 × 0.5 × 0.6.
Thus, SF,2 = 0.5 × 0.5 × 0.5 × 0.6.

Similarly:

SL,2 = P(H|L) × max(Sk,1 × P(L|k))

max(SF,1 × P(L|F), SL,1 × P(L|L))
max(0.5 × 0.5 × 0.4, 0.5 × 0.2 × 0.6)

The max is again for a fair coin in the prior state = 0.5 × 0.5 × 0.4.
Thus, SL,2 = 0.8 × 0.5 × 0.5 × 0.4.

In addition to recording SF,2 and SL,2, we also need to record which was the most likely path for traceback. In both cases the most likely path to SF,2 and SL,2
was from SF,1. Subsequent calculations of Sk,i can be made in similar ways using the preceding Sk,i-1 values.
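The step-2 arithmetic above can be checked with a short Python snippet; the dictionary layout for A and E is an illustrative assumption, with the numbers taken directly from the matrices in the text:

```python
# One recursion step at position 2 of 'THTH' (observation x2 = 'H'),
# reproducing the S_F,2 and S_L,2 products worked out above.
A = {'F': {'F': 0.6, 'L': 0.4}, 'L': {'F': 0.4, 'L': 0.6}}
E = {'F': {'H': 0.5, 'T': 0.5}, 'L': {'H': 0.8, 'T': 0.2}}

SF1 = 0.5 * E['F']['T']  # 0.5 x 0.5
SL1 = 0.5 * E['L']['T']  # 0.5 x 0.2

# S_k,2 = P(H|k) * max over prior states j of S_j,1 * P(k|j)
SF2 = E['F']['H'] * max(SF1 * A['F']['F'], SL1 * A['L']['F'])
SL2 = E['L']['H'] * max(SF1 * A['F']['L'], SL1 * A['L']['L'])
```

Evaluating the products gives SF,2 = 0.075 and SL,2 = 0.08, and in both cases the max comes from the Fair prior state, matching the traceback note above.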
Question 2
What is SF,3 and SL,3?
Use the A and E matrices given above, the observed string 'THTH', and use SF,2 = 0.17 and SL,2 = 0.28 in your calculation.
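Since the question asks for Python code rather than an explanation, here is a minimal sketch of the Question 2 calculation. The dictionary layout is an illustrative assumption; the values SF,2 = 0.17 and SL,2 = 0.28 are taken as given in the question, and the third observation of 'THTH' is 'T':

```python
# S_F,3 and S_L,3 for observation x3 = 'T' of 'THTH', using the
# given S_F,2 = 0.17 and S_L,2 = 0.28.
A = {'F': {'F': 0.6, 'L': 0.4}, 'L': {'F': 0.4, 'L': 0.6}}
E = {'F': {'H': 0.5, 'T': 0.5}, 'L': {'H': 0.8, 'T': 0.2}}
SF2, SL2 = 0.17, 0.28

# S_k,3 = P(T|k) * max over prior states j of S_j,2 * P(k|j)
SF3 = E['F']['T'] * max(SF2 * A['F']['F'], SL2 * A['L']['F'])
SL3 = E['L']['T'] * max(SF2 * A['F']['L'], SL2 * A['L']['L'])
print(SF3, SL3)
```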