To train a system with our modified CTC algorithm, we need the likelihood P(a | x) of the reference transcription a given the input x. We can compute it with the monotonic_forward algorithm. Table cell table[u, t] stores the total likelihood of all prefixes b₁:t = b₁, ..., bₜ that reduce to the prefix a₁:u = a₁, ..., aᵤ:

table[u, t] = Σ_{b₁:t ∈ B⁻¹(a₁:u, t)} P(b₁:t | x), where B⁻¹(a₁:u, t) = {b₁:t | B(b₁:t) = a₁:u}.

2a) Express table[u, t] as a function of P(bₜ | x), table[u, t − 1], and table[u − 1, t − 1]. Hint: use the notation P(bₜ = v | x) to indicate that bₜ takes some type v ∈ V.

2b) Next, write an expression for the elements of the dummy row table[0, t] and the dummy column table[u, 0]. Hint: table[1, 0] = 0 but table[0, 1] = P(b₁ = ε | x). Consider why, and how these expressions extend to u > 1 and t > 1.
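For concreteness, here is a minimal sketch of how the table described above could be filled in, assuming the reduction B simply deletes the blank symbol ε (so at each time step the emission either is ε, leaving the reduced prefix unchanged, or is the next reference symbol aᵤ, extending it by one). The function name follows the monotonic_forward name in the prompt, but the argument layout (a per-frame probability matrix and a blank index) is an assumption for illustration, not the course's exact interface.

```python
import numpy as np

def monotonic_forward(probs, ref, blank=0):
    """Sketch of the forward DP, assuming B only removes blanks.

    probs : (T, V) array with probs[t, v] = P(b_{t+1} = v | x)  (assumed layout)
    ref   : list of length U holding the reference labels a_1 .. a_U
    Returns the (U+1, T+1) table; table[U, T] = P(a | x).
    """
    T = probs.shape[0]
    U = len(ref)
    table = np.zeros((U + 1, T + 1))

    # Dummy column (2b): with zero emissions, no non-empty prefix can be
    # produced, so table[u, 0] = 0 for u >= 1; the empty prefix has mass 1.
    table[0, 0] = 1.0

    # Dummy row (2b): the empty prefix a_{1:0} is reached only by emitting
    # blank at every step so far.
    for t in range(1, T + 1):
        table[0, t] = table[0, t - 1] * probs[t - 1, blank]

    # Main recurrence (2a): at step t either emit blank and keep the same
    # reduced prefix, or emit a_u and extend the prefix by one symbol.
    for t in range(1, T + 1):
        for u in range(1, U + 1):
            stay = table[u, t - 1] * probs[t - 1, blank]
            advance = table[u - 1, t - 1] * probs[t - 1, ref[u - 1]]
            table[u, t] = stay + advance

    return table
```

In practice this recursion would be run in log space to avoid underflow for long inputs; the plain-probability version above is kept only to mirror the notation of the question.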