CS 571 Week 3 Lecture Questions 2

School: University of Nevada, Las Vegas
Course: 571
Subject: Computer Science
Date: Jun 11, 2024
Type: PDF
Pages: 6
Uploaded by: SuperInternet12237
What are the different gates present in an LSTM?
   O The remember, forget, and update gate
   @ The forget, update, and output gate
   O The memory, update, and output gate
   O The remember, update, and output gate
Correct! These three gates help retain information in an LSTM for a longer time.

What component of an LSTM determines whether information in the cell state is retained?
   O The update gate
   O The memory gate
   O The remember gate
   @ The forget gate
Correct! We scale by values ranging from 0 to 1 to determine what to keep from the cell state.

Which component of an LSTM allows for storing information across an arbitrary number of steps?
   @ The cell state
   O The hidden state
   O The output gate
   O The remember gate
Correct! Information in the cell state is maintained by the forget gate. It is updated, as determined by the input gate. It is also used, as determined by the output gate.

In the robot bottle-lifting example, using LSTM layers as part of the model allowed the robot to utilize which type of information?
   O Spatial information
   @ Temporal information
   O Tactile information
   O Proprioceptive information
Correct! Temporal information relates to time. Information from previous time steps is important for a dynamic task like rotating and flipping a bottle.
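The gate structure described in the answers above can be sketched in pure Python. This is a toy scalar LSTM step, not PyTorch's implementation; the `lstm_step` and `sigmoid` names and all weight values are made up for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One toy LSTM step on scalar state. f, i, o are the forget, input,
    and output gates; each is a sigmoid, so each lies in (0, 1)."""
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])           # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])           # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])           # output gate
    c_tilde = math.tanh(w["wc"] * x + w["uc"] * h_prev + w["bc"])   # candidate
    c = f * c_prev + i * c_tilde   # cell state: keep old info, mix in new info
    h = o * math.tanh(c)           # hidden state exposed to the next step
    return h, c

# Arbitrary toy weights; a real LSTM learns these.
w = {k: 0.5 for k in ("wf", "uf", "bf", "wi", "ui", "bi",
                      "wo", "uo", "bo", "wc", "uc", "bc")}
h, c = 0.0, 0.0
for x in (1.0, -0.5, 0.25):        # run a short input sequence
    h, c = lstm_step(x, h, c, w)
print(round(h, 4), round(c, 4))
```

Note how the forget gate `f` directly scales the previous cell state `c_prev` by a value in (0, 1), which is exactly the retention mechanism the second question asks about.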
1. Which recurrent element is incorporated along with the hidden state in LSTMs? (1/1 point)
   O Remember gate
   O Forget gate
   @ Cell state
   O Memory gate
Correct! The updated cell state will be used in conjunction with the current input and previous hidden state to output the next hidden state.

2. Which recurrent element is incorporated along with the cell state in LSTMs? (1/1 point)
   O Forget gate
   O Remember gate
   @ Hidden state
   O Memory gate
Correct! The previous hidden state will be used in conjunction with the current input and updated cell state to output the next hidden state.

3. Each gate within an LSTM unit uses which activation functions? (1/1 point)
   O Tanh activation functions
   O Sigmoid or tanh activation functions
   @ Sigmoid activation functions
   O Sigmoid or ReLU activation functions
Correct! Given that each gate is an elementwise scalar that determines how much signal passes through, it makes sense to use values between 0 and 1.
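The point in question 3 about sigmoid gates acting as elementwise scalars in (0, 1) can be seen directly: a large positive gate pre-activation lets nearly all of the signal through, a large negative one blocks nearly all of it. The specific numbers below are arbitrary illustrations.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

signal = [2.0, -1.0, 0.5]
gate_logits = [4.0, 0.0, -4.0]            # large +, neutral, large -
gate = [sigmoid(g) for g in gate_logits]  # ~0.982, 0.5, ~0.018
gated = [s * g for s, g in zip(signal, gate)]
print([round(v, 3) for v in gated])       # → [1.964, -0.5, 0.009]
```

The first element passes almost unchanged, the second is halved, and the third is almost entirely suppressed, which is why sigmoid (rather than tanh or ReLU) is the natural choice for gates.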
4. Which problem for RNNs was the LSTM developed to address? (1/1 point)
   O Memory leaks
   O Lack of gating units
   @ Vanishing gradients
   O Too many parameters
Correct! By using the cell state and calculating partial derivatives with respect to the weights of the forget gate, we don't need to rely on backpropagation through time to update parameters dealing with distant previous steps.

5. What is a drawback or weakness of the LSTM? (1/1 point)
   @ Requires training for many more parameters
   O Requires a larger testing dataset
   O Exploding gradients
   O Vanishing gradients
Correct! Additional weights must be learned for each gate of the LSTM.

6. What are the three gates of an LSTM? (1/1 point)
   @ Input gate, forget gate, output gate
   O Update gate, remember gate, cell gate
   O Update gate, forget gate, hidden gate
   O Input gate, remember gate, output gate
Correct! There are other variants of RNNs, such as GRUs, which use only an update and a reset gate. These will not be covered in the scope of this course.

Why is a fully connected output layer needed after the recurrent layer?
   @ To properly scale the RNN's activations to target outputs.
   O To aid in performing backpropagation through time.
   O To ready the hidden state for the next forward pass.
   O To connect input to output.
Correct! It is unlikely that the RNN's activations will directly fit the target outputs, so we pass them through an additional fully connected layer and can apply a different activation function as needed.
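The vanishing-gradient problem from question 4 arises because backpropagation through time multiplies one Jacobian factor per time step; with a tanh activation each factor has magnitude at most 1, so the product shrinks exponentially with sequence length. A toy scalar illustration (the recurrent weight 0.9 and pre-activation 0.5 are arbitrary choices):

```python
import math

def tanh_grad(x):
    return 1.0 - math.tanh(x) ** 2   # derivative of tanh, always <= 1

w = 0.9          # hypothetical recurrent weight
grad = 1.0
for step in range(50):               # backprop through 50 time steps
    grad *= w * tanh_grad(0.5)       # each factor < 1 shrinks the gradient
print(f"{grad:.2e}")                 # vanishingly small after 50 steps
```

After 50 steps the surviving gradient is on the order of 1e-8, so weights tied to distant inputs receive essentially no learning signal; the LSTM's additively updated cell state sidesteps this chain of shrinking factors.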
For a single sequence of inputs, how often is the forward() method in an RNNCell called?
   @ The forward method is called for each element in an input sequence of arbitrary length.
   O The forward method is called for each element in an input sequence of fixed length.
   O The forward method is called once.
   O The RNNCell will automatically call the forward method for each element of a sequence.
Correct! At each step in the sequence, that specific input element and the previous hidden state are passed in, and this can be repeated as needed.

What is the major difference between dealing with torch.nn.RNNCell() and torch.nn.LSTMCell()?
   O Additional parameters
   O Hidden state
   O Fully connected output layer
   @ Cell state
Correct! The cell state allows the LSTM to have greater access to information further back than an RNN might.

1. Which PyTorch class can be used as a single RNN layer? (1/1 point)
   O torch.mn.RNNLayer()
   O torch.mn.RNNCell()
   @ torch.nn.RNNCell()
   O torch.nn.RNNLayer()
Correct! Classes such as torch.nn.RNN() allow us to simply connect multiple RNNCell() layers in sequence.
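The per-element calling pattern described above can be mimicked without PyTorch. `ToyRNNCell` below is a hypothetical scalar stand-in for the pattern torch.nn.RNNCell() follows, where each forward call takes the current input element together with the previous hidden state; the weights are made up for illustration.

```python
import math

class ToyRNNCell:
    """Minimal scalar stand-in for an RNN cell: h_t = tanh(w_x * x_t + w_h * h_{t-1})."""
    def __init__(self, w_x=0.8, w_h=0.5):   # toy weights (hypothetical)
        self.w_x, self.w_h = w_x, w_h

    def forward(self, x, h_prev):
        return math.tanh(self.w_x * x + self.w_h * h_prev)

cell = ToyRNNCell()
h = 0.0                                  # zero-initialized hidden state
for x in [1.0, 0.5, -0.25, 0.75]:        # forward() is called once per element,
    h = cell.forward(x, h)               # for a sequence of arbitrary length
print(round(h, 4))
```

Because the loop lives outside the cell, the same cell handles sequences of any length, which is the behavior the first question above tests.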
2. In addition to the input, what is required to perform a forward pass through an RNN? (1/1 point)
   @ The previous hidden state
   O The previous cell state
   O The current hidden state
   O The previous output
Correct! The previous hidden state will be combined into the current hidden state.

3. What is the most common way to initialize the hidden state of an RNN? (1/1 point)
   O Previous hidden state initialization
   O Unity initialization
   O Random initialization
   @ Zero initialization
Correct! The hidden state is most commonly set to be equal to a zero vector.

4. In addition to the input and previous hidden state, what is required to perform a forward pass through an LSTM? (1/1 point)
   O Current hidden state
   O Current cell state
   @ Previous cell state
   O Previous output
Correct! The cell state is how an LSTM passes on long-term memory.

How are new input elements obtained that can be fed to the LSTM to make predictions over arbitrary distances?
   O The hidden state generates additional inputs to be used in sequence.
   @ Append the LSTM's predictions to the known input sequence.
   O Take in a longer input sequence.
   O Append the cell state to the known input sequence.
Correct! This process can be repeated as many times as needed to predict further and further out without needing additional inputs from future timesteps.
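The zero-initialization and append-the-predictions answers above combine naturally: start from a zero hidden state, consume the known inputs, then feed each prediction back in as the next input. `predict_next` and its weights are a hypothetical one-step model, not a trained LSTM.

```python
import math

def predict_next(x, h, w_x=0.8, w_h=0.5):
    """Toy one-step recurrent model (hypothetical weights).
    Returns (prediction, new hidden state); here the prediction is the hidden state."""
    h = math.tanh(w_x * x + w_h * h)
    return h, h

seq = [0.1, 0.2, 0.3]          # known input sequence
h = 0.0                        # zero-initialized hidden state
for x in seq:
    pred, h = predict_next(x, h)

for _ in range(3):             # predict 3 steps beyond the known sequence
    seq.append(pred)           # append the prediction to the input sequence
    pred, h = predict_next(pred, h)

print(len(seq), [round(v, 3) for v in seq])
```

The second loop can run as many times as needed, which is how predictions over arbitrary distances are produced without any future inputs.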
1. Why is a feedforward neural network poorly suited to model a periodic series? (1/1 point)
   O A feedforward neural network does not have enough layers.
   @ A feedforward neural network does not utilize temporal context.
   O A feedforward neural network does not have enough neurons.
   O A feedforward neural network does not utilize spatial context.
Correct! Given a single point on a periodic time series, even humans are unable to usefully predict future time steps.

2. For which task is an LSTM best suited? (1/1 point)
   @ Predicting stock trends
   O Modeling dynamics
   O Identifying objects
   O Recognizing faces
Correct! LSTMs are great for taking in a time series and predicting future values over arbitrary distances. Note that accuracy typically decreases the further out the predictions go.

3. When handling data with dependencies on very distant time steps, which type of neural network should be used? (1/1 point)
   O FF
   @ LSTM
Correct! As LSTMs are less prone to vanishing gradients, they make an excellent choice when dealing with data that rely on distant time steps.

4. In what way should a model's loss change over the course of the training process? (1/1 point)
   O The loss should always decrease but the rate of decrease will slow.
   O The loss should follow a smooth logarithmic curve.
   O The loss should always reach the minimum possible loss.
   @ The loss should decrease on average until a plateau is reached.
Correct! If the testing loss begins increasing after this point, the model is likely overfitting to the training data.
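Question 1's point can be made concrete: on a sine wave, two time steps can share the same input value yet have different next values, so no function of the single current value (which is all a feedforward network sees at one step) can predict both. The step size 0.1 and the chosen times are arbitrary.

```python
import math

# sin(t) takes the same value where the wave is rising and where it is falling.
t_rising, t_falling = 0.5, math.pi - 0.5
x1, x2 = math.sin(t_rising), math.sin(t_falling)               # equal inputs
y1, y2 = math.sin(t_rising + 0.1), math.sin(t_falling + 0.1)   # different targets
print(abs(x1 - x2) < 1e-12, round(y1, 3), round(y2, 3))        # → True 0.565 0.389
```

A recurrent model resolves the ambiguity through its hidden state, which encodes whether recent inputs were increasing or decreasing.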