Figure P1.17 shows the signal-flow graph of a recurrent network made up of two neurons. Write the nonlinear difference equation that defines the evolution of x1(n) or that of x2(n). These two variables define the outputs of the top and bottom neurons, respectively. What is the order of this equation?

Chapter 1 Introduction

FIGURE P1.13 (network signal-flow graph; the figure's weight labels include -2, 2, and 6)
1.14 The network described in Fig. P1.13 has no biases. Suppose that biases equal to -1 and
+1 are applied to the top and bottom neurons of the first hidden layer, and biases equal
to +1 and -2 are applied to the top and bottom neurons of the second hidden layer.
Write the new form of the input-output mapping defined by the network.
1.15 Consider a multilayer feedforward network, all the neurons of which operate in their
linear regions. Justify the statement that such a network is equivalent to a single-layer
feedforward network.
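The claim in Problem 1.15 can be checked numerically: composing linear layers is just multiplying their weight matrices, so any stack of linear layers collapses to one. A minimal sketch (the layer sizes and random weights below are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three layers of purely linear neurons: each layer is just a weight matrix.
W1 = rng.standard_normal((4, 3))   # 3 inputs  -> 4 hidden
W2 = rng.standard_normal((5, 4))   # 4 hidden  -> 5 hidden
W3 = rng.standard_normal((2, 5))   # 5 hidden  -> 2 outputs

x = rng.standard_normal(3)

# Layer-by-layer forward pass of the multilayer linear network.
multilayer_out = W3 @ (W2 @ (W1 @ x))

# Equivalent single-layer network: one weight matrix W = W3 W2 W1.
W = W3 @ W2 @ W1
single_layer_out = W @ x

print(np.allclose(multilayer_out, single_layer_out))  # True
```

The same argument extends to biases, since an affine map composed with an affine map is again affine.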
1.16 Construct a fully recurrent network with 5 neurons, but with no self-feedback.
1.17 Figure P1.17 shows the signal-flow graph of a recurrent network made up of two neurons.
Write the nonlinear difference equation that defines the evolution of x1(n) or that
of x2(n). These two variables define the outputs of the top and bottom neurons, respectively.
What is the order of this equation?
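Since Figure P1.17 is not reproduced here, the following sketch only illustrates the structure the problem describes: two neurons with no self-feedback, each driven by the other's delayed output. The weights w12, w21 and the tanh activation are hypothetical placeholders. Eliminating one variable turns the coupled first-order updates into a single second-order equation, which the code verifies numerically:

```python
import numpy as np

# Hypothetical cross-coupling weights (the actual figure is not shown here).
w12, w21 = 0.8, -1.3
phi = np.tanh  # a typical sigmoidal activation

N = 50
x1 = np.zeros(N)
x2 = np.zeros(N)
x1[0], x2[0] = 0.5, -0.2  # arbitrary initial conditions

# Coupled first-order updates: each neuron sees the OTHER neuron's
# delayed output (no self-feedback).
for n in range(1, N):
    x1[n] = phi(w12 * x2[n - 1])
    x2[n] = phi(w21 * x1[n - 1])

# Substituting x2(n-1) = phi(w21 * x1(n-2)) eliminates x2 and yields a
# single SECOND-order equation in x1:
#   x1(n) = phi(w12 * phi(w21 * x1(n - 2)))
for n in range(2, N):
    assert np.isclose(x1[n], phi(w12 * phi(w21 * x1[n - 2])))
print("second-order recursion verified")
```

This is why the answer to "what is the order of this equation?" is tied to the total delay around the feedback loop.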
1.18 Figure P1.18 shows the signal-flow graph of a recurrent network consisting of two neu-
rons with self-feedback. Write the coupled system of two first-order nonlinear difference
equations that describe the operation of the system.
1.19 A recurrent network has 3 source nodes, 2 hidden neurons, and 4 output neurons.
Construct an architectural graph that describes such a network.
FIGURE P1.17
FIGURE P1.18
Knowledge representation
1.20 A useful form of preprocessing is based on the autoregressive (AR) model described by
the difference equation (for real-valued data)

    y(n) = w1*y(n - 1) + w2*y(n - 2) + ... + wM*y(n - M) + v(n)

where y(n) is the model output; v(n) is a sample drawn from a white-noise process of
zero mean and some prescribed variance; w1, w2, ..., wM are the AR model coefficients;
and M is the model order. Show that the use of this model provides two forms of geometric
invariance: (a) scale, and (b) time translation. How could these two invariances be
used in neural networks?
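Both invariances in Problem 1.20 can be demonstrated numerically: scaling the whole series scales both sides of the difference equation, and shifting the analysis window changes the data but not the underlying coefficients. A sketch for a second-order model (M = 2), with coefficients and series length chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Generate data from a stable 2nd-order AR model:
#   y(n) = w1*y(n-1) + w2*y(n-2) + v(n)
w1, w2 = 0.6, -0.3
N = 2000
v = rng.standard_normal(N)
y = np.zeros(N)
for n in range(2, N):
    y[n] = w1 * y[n - 1] + w2 * y[n - 2] + v[n]

def fit_ar2(series):
    """Least-squares estimate of (w1, w2) from a series."""
    Y = np.column_stack([series[1:-1], series[:-2]])  # y(n-1), y(n-2)
    t = series[2:]                                    # targets y(n)
    coef, *_ = np.linalg.lstsq(Y, t, rcond=None)
    return coef

# (a) Scale invariance: scaling the series leaves the fitted AR
# coefficients unchanged, since both sides of the equation scale together.
c_original = fit_ar2(y)
c_scaled = fit_ar2(7.5 * y)
print(np.allclose(c_original, c_scaled))  # True

# (b) Time-translation invariance: a shifted window gives (up to
# estimation noise) the same coefficients.
c_shifted = fit_ar2(y[500:])
print(np.round(c_original, 2), np.round(c_shifted, 2))
```

For a neural network, this suggests feeding the AR coefficients rather than the raw samples as features, so the input representation is insensitive to amplitude scaling and to when the window starts.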
1.21 Let x be an input vector, and s(α, x) be a transformation operator acting on x and
depending on some parameter α. The operator s(α, x) satisfies two requirements:
• s(0, x) = x
• s(α, x) is differentiable with respect to α.
The tangent vector is defined by the partial derivative ∂s(α, x)/∂α (Simard et al., 1992).
Suppose that x represents an image, and α is a rotation parameter. How would you
compute the tangent vector for the case when α is small? The tangent vector is locally
invariant with respect to rotation of the original image; why?
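For small α the tangent vector can be computed without ever rotating the image: by the chain rule, differentiating I(R(−α)(u, v)) at α = 0 reduces to a combination of the image's spatial gradients, T = u·∂I/∂v − v·∂I/∂u in centre-origin coordinates (one common sign convention); equivalently one can use the finite difference (s(Δα, x) − x)/Δα for a small Δα. A sketch on a synthetic Gaussian-blob image (the blob parameters are arbitrary):

```python
import numpy as np

def gaussian_image(size, cy, cx, sigma=3.0):
    """A Gaussian blob -- a convenient synthetic test image."""
    yy, xx = np.mgrid[0:size, 0:size]
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))

def rotation_tangent(img):
    """Tangent vector ds(alpha, x)/dalpha at alpha = 0 for rotation
    about the image centre, from the image's spatial gradients."""
    size = img.shape[0]
    c = (size - 1) / 2.0
    yy, xx = np.mgrid[0:size, 0:size]
    u, v = yy - c, xx - c                # centred coordinates
    dI_du, dI_dv = np.gradient(img)      # finite-difference gradients
    return u * dI_dv - v * dI_du

# A blob AT the image centre is rotationally symmetric, so its tangent
# vector is numerically negligible; an off-centre blob's is not. This is
# the local invariance the problem asks about: to first order, rotating
# the image moves it along the tangent direction, and an image that is
# unchanged by rotation has (near-)zero tangent vector.
centered = rotation_tangent(gaussian_image(33, 16, 16))
offset = rotation_tangent(gaussian_image(33, 16, 22))
print(np.abs(centered).max() < 0.2 * np.abs(offset).max())  # True
```

In tangent-prop-style training (Simard et al., 1992), this vector is used to penalize the network's sensitivity along the transformation direction.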