Please don't post an unnecessary or copied answer; if you don't know the solution, don't answer.

Q1. Consider the neural network in Figure 25.13. Let bias values be fixed at 0, and let
the weight matrices between the input and hidden, and hidden and output layers,
respectively, be:
W = (w1, w2, w3) = (1, 1, −1)
W′ = (w′1, w′2, w′3)^T = (0.5, 1, 2)^T
Assume that the hidden layer uses ReLU, whereas the output layer uses sigmoid
activation. Assume SSE error. Answer the following questions, when the input is x =
4 and the true response is y = 0:
[Figure omitted: the input x connects to hidden units z1, z2, z3 via weights w1, w2, w3; the hidden units connect to a single output unit.]
Figure 25.13. Neural network for Q1.
(a) Use forward propagation to compute the predicted output.
(b) What is the loss or error value?
(c) Compute the net gradient vector δ^o for the output layer.
(d) Compute the net gradient vector δ^h for the hidden layer.
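
For reference, here is a minimal NumPy sketch of parts (a)-(d). It assumes the architecture implied by the question text (one input x, three ReLU hidden units z1, z2, z3, one sigmoid output, no biases) and takes the SSE loss as E = 0.5·(o − y)^2; some texts omit the 1/2 factor, which scales the gradients by 2.

```python
# Minimal sketch of Q1 (a)-(d), under the assumptions stated above.
import numpy as np

x = 4.0                                  # input
y = 0.0                                  # true response
w = np.array([1.0, 1.0, -1.0])           # input -> hidden weights (w1, w2, w3)
w_prime = np.array([0.5, 1.0, 2.0])      # hidden -> output weights (w'1, w'2, w'3)

# (a) forward propagation
net_z = w * x                            # net input to each hidden unit
z = np.maximum(net_z, 0.0)               # ReLU activation
net_o = w_prime @ z                      # net input to the output unit
o = 1.0 / (1.0 + np.exp(-net_o))         # sigmoid activation (predicted output)

# (b) SSE loss (assumed here with the 1/2 factor)
loss = 0.5 * (o - y) ** 2

# (c) net gradient at the output layer: delta_o = dE/d(net_o)
delta_o = (o - y) * o * (1.0 - o)

# (d) net gradient at the hidden layer: delta_h_i = ReLU'(net_z_i) * w'_i * delta_o
relu_grad = (net_z > 0).astype(float)
delta_h = relu_grad * w_prime * delta_o

print(o, loss, delta_o, delta_h)
```

Under these assumptions the forward pass gives z = (4, 4, 0) and net_o = 6, so o = σ(6) ≈ 0.9975, the loss is ≈ 0.4975, δ^o ≈ 0.00246, and δ^h ≈ (0.00123, 0.00246, 0).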