Question
(c) A fully connected neural network is given below.

[Figure: inputs x1 and x2, together with a bias input of 1, are connected by weights w11, w21, w31 and w12, w22, w32 to the net inputs a1 and a2. Each net input passes through a ReLU unit. The ReLU outputs, together with a bias input of 1, are connected by weights h11, h21, h31 and h12, h22, h32 to y1 and y2, which feed a Softmax layer producing the outputs z1 and z2.]

Note that:

The ReLU activation function is defined as follows: for an input x, ReLU(x) = max(0, x).

The Softmax function is defined as follows: given the inputs x_i, i = 1, ..., n, the outputs are

S(x_i) = e^{x_i} / Σ_{j=1}^{n} e^{x_j}

i. Calculate the net inputs a1 and a2.
ii. Calculate the ReLU outputs.
iii. Calculate y1 and y2.
iv. Calculate z1 and z2.
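The Softmax definition above can be sketched as a small Python function; the max-subtraction step is a standard numerical-stability trick (it does not change the result, since it cancels in the ratio):

```python
import math

def softmax(xs):
    """Softmax over a list of inputs: S(x_i) = e^{x_i} / sum_j e^{x_j}."""
    m = max(xs)                               # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `softmax([0.0, 0.0])` returns `[0.5, 0.5]`, and the outputs always sum to 1.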
Suppose that the weights of the network are

w11 = -1.5; w12 = -1.0; w21 = -1.2; w22 = 1.0; w31 = -0.6; w32 = 1.0
h11 = -0.5; h12 = -1.0; h21 = 1.0; h22 = 1.5; h31 = 0.4; h32 = 0.3

for an input of x1 = 1.0 and x2 = 2.0.
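The whole forward pass can be sketched in Python. This is not the official worked solution; it assumes that weight w_ij connects input i to unit j, that the third weight in each group (w3j, h3j) multiplies the bias input of 1, and it reads the garbled "w21-1.2" in the statement as w21 = -1.2:

```python
import math

# Weights as given (w21 read as -1.2; its "=" is garbled in the source)
w11, w12 = -1.5, -1.0
w21, w22 = -1.2, 1.0
w31, w32 = -0.6, 1.0   # bias weights for a1, a2 (assumed)
h11, h12 = -0.5, -1.0
h21, h22 = 1.0, 1.5
h31, h32 = 0.4, 0.3    # bias weights for y1, y2 (assumed)

x1, x2 = 1.0, 2.0

# i. Net inputs (assuming w_ij connects input i to unit j)
a1 = w11 * x1 + w21 * x2 + w31 * 1.0
a2 = w12 * x1 + w22 * x2 + w32 * 1.0

# ii. ReLU outputs
r1 = max(0.0, a1)
r2 = max(0.0, a2)

# iii. Second-layer net inputs
y1 = h11 * r1 + h21 * r2 + h31 * 1.0
y2 = h12 * r1 + h22 * r2 + h32 * 1.0

# iv. Softmax outputs
total = math.exp(y1) + math.exp(y2)
z1, z2 = math.exp(y1) / total, math.exp(y2) / total
```

Under these assumptions the intermediate values fall out directly, e.g. a1 = (-1.5)(1.0) + (-1.2)(2.0) + (-0.6) = -4.5, which ReLU clips to 0; z1 and z2 then sum to 1 by construction.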