An LSTM with a single hidden layer has 2 neurons in all its gates. The forget gate weight matrix (Wf), the input gate weight matrix (Wi), and the output gate weight matrix (Wo) are all given:

Wf = Wi = Wo = [[-0.8, 0, -0.3, -0.75, 0.26, 0.09]; [0.05, -0.1, -0.9, 0, 0.85, 0.07]]

The first two columns are the weights for the CEC memory (i.e. Wfc/Wic/Woc), the next two columns are the weights for the hidden memory (i.e. Wfh/Wih/Who), and the last two columns are the weights for the input (i.e. Wfx/Wix/Wox). The input pattern detection weights Wc are given by:

Wc = [[0.8, 0.9, 0.3, 1]; [0.3, 0.45, 0.7, 0.5]]

Here the first two columns are the weights for the hidden vector (Wch) and the remaining two are the weights for the input (Wcx).

Let us define the "memory duration" of the network as the minimum number of time steps N such that, for an isolated input at time t (with no previous or subsequent inputs), the length of the cell memory activation vector at time t+N almost certainly falls to less than 1/100th of its value at time t. This is effectively the amount of time in which the influence of the input at time t all but vanishes from the network.

The input to the network at time t is x(t) = [-1, 1]^T, where T denotes transpose. The CEC memory and the hidden memory are both zero vectors initially. What is the memory duration of the above network (choose the closest answer)? You may want to simulate the network to determine this value (an analytical solution is difficult to obtain). Hint: see slide 88 of the "Recurrent Networks: Stability Analysis and LSTMs" lecture if you are unsure about what the weights are.
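The suggested simulation can be sketched as follows. This is a minimal sketch, not the official solution: it assumes the standard peephole LSTM update (the form the weight layout suggests, with gates peeking at the CEC state), a tanh nonlinearity for the cell input, and an output gate that peeks at the *updated* cell state. It counts the steps after the isolated input until the cell-state norm drops below 1/100th of its value at time t.

```python
import math

# Gate weights from the problem; columns are [CEC | hidden | input].
W_gates = [[-0.8, 0.0, -0.3, -0.75, 0.26, 0.09],
           [0.05, -0.1, -0.9, 0.0, 0.85, 0.07]]
# Input pattern detection weights; columns are [hidden | input].
Wc_full = [[0.8, 0.9, 0.3, 1.0],
           [0.3, 0.45, 0.7, 0.5]]

def cols(M, a, b):
    return [row[a:b] for row in M]

Wgc, Wgh, Wgx = cols(W_gates, 0, 2), cols(W_gates, 2, 4), cols(W_gates, 4, 6)
Wch, Wcx = cols(Wc_full, 0, 2), cols(Wc_full, 2, 4)

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def vadd(*vs):
    return [sum(t) for t in zip(*vs)]

def sigmoid(v):
    return [1.0 / (1.0 + math.exp(-x)) for x in v]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def step(c, h, x):
    # Wf = Wi = Wo, so the forget and input gates share one pre-activation.
    g = sigmoid(vadd(matvec(Wgc, c), matvec(Wgh, h), matvec(Wgx, x)))
    f, i = g, g
    # Cell input ("pattern detection") with tanh nonlinearity.
    z = [math.tanh(u) for u in vadd(matvec(Wch, h), matvec(Wcx, x))]
    c_new = [fk * ck + ik * zk for fk, ck, ik, zk in zip(f, c, i, z)]
    # Assumption: the output gate peeks at the updated cell state,
    # as in the standard peephole LSTM.
    o = sigmoid(vadd(matvec(Wgc, c_new), matvec(Wgh, h), matvec(Wgx, x)))
    h_new = [ok * math.tanh(ck) for ok, ck in zip(o, c_new)]
    return c_new, h_new

# Isolated input at time t; zero vectors before and after.
c, h = [0.0, 0.0], [0.0, 0.0]
c, h = step(c, h, [-1.0, 1.0])
ref = norm(c)                      # ||c(t)|| right after the input

N = 0
while norm(c) >= ref / 100.0:
    c, h = step(c, h, [0.0, 0.0])  # no subsequent inputs
    N += 1
print("memory duration N =", N)
```

Linearizing around the zero state (gates saturate to 0.5, h ≈ 0.5·c), the cell state eventually decays roughly per the dominant eigenvalue of 0.5·I + 0.25·Wch, which is less than 1, so the loop is guaranteed to terminate; compare the printed N against the answer choices.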
