Single Perceptron
TRUE OR FALSE
1) A single perceptron can compute the XOR function. [ ]
2) A single Threshold-Logic Unit can realize the AND function. [ ]
3) A perceptron is guaranteed to perfectly learn a given linearly separable function within a finite number of training steps. [ ]
4) The more hidden-layer units a BPN has, the better it can predict desired outputs for new inputs that it was not trained with. [ ]
5) A three-layer BPN with 5 neurons in each layer has a total of 50 connections and 50 weights. [ ]
6) The backpropagation learning
7) An epoch is one presentation of every pattern in the training set to the neural network. [ ]
8) A training set is a set of pairs of output patterns with corresponding input patterns. [ ]
9) ANNs are “designed to detect, and respond to, the presence of features in an input pattern vector that is presented as a dynamic pattern to the network.” [ ]
10) In backpropagation learning, we should start with a small learning-rate parameter η and slowly increase it during the learning process. [ ]
11) The line equation for the following (w1 = -0.4, w2 = -0.5, θ = -0.3) is x2 = (-0.4/-0.5)·x1 + (-0.3/-0.5). [ ]
12) The Delta Rule is capable of training all the weights in multilayer nets with no a priori knowledge of the training set. [ ]
13) The backpropagation algorithm is a generalized delta rule. [ ]
14) The sigmoid function uses the net amount of excitation as its argument. [ ]
15) The backpropagation algorithm is used to find the global minimum of the error function. [ ]
16) The normalization function is (Xi − min(X)) / (max(X) − min(X)). [ ]
17) Neurons in the hidden layer can’t be observed through the input/output behaviour of the network. [ ]
18) Multi-layer feed-forward networks can learn any function provided they have enough units and time to learn. [ ]
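Questions 1 and 2 can be checked by hand. A minimal sketch of a Threshold-Logic Unit with a step activation: the weights and threshold below are one illustrative choice that realizes AND, while no choice of a single weight/threshold pair can realize XOR, since XOR is not linearly separable.

```python
# Sketch: a threshold-logic unit (TLU) with a step activation.
# w1 = w2 = 1.0, theta = 1.5 is one illustrative choice that computes AND.

def tlu(x1, x2, w1, w2, theta):
    """Fire (1) iff the net excitation w1*x1 + w2*x2 reaches the threshold."""
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

# Truth table for AND:
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", tlu(x1, x2, 1.0, 1.0, 1.5))
# No (w1, w2, theta) makes this unit compute XOR: a single line
# cannot separate {(0,1), (1,0)} from {(0,0), (1,1)}.
```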
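For question 3, the perceptron convergence theorem guarantees a finite number of weight updates on linearly separable data. A sketch of the learning rule on the (linearly separable) AND function; the learning rate and epoch cap are illustrative choices:

```python
# Sketch of the perceptron learning rule on a linearly separable set (AND).
# lr and max_epochs are illustrative; convergence is guaranteed in a
# finite number of updates for linearly separable data.

def train_perceptron(samples, lr=0.1, max_epochs=100):
    w1 = w2 = b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for (x1, x2), target in samples:
            out = 1 if w1 * x1 + w2 * x2 + b >= 0 else 0
            delta = target - out
            if delta:
                errors += 1
                w1 += lr * delta * x1
                w2 += lr * delta * x2
                b += lr * delta
        if errors == 0:  # an epoch with no mistakes: training has converged
            break
    return w1, w2, b

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(and_data)
```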
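For question 11, a quick numeric check, assuming the usual boundary convention w1·x1 + w2·x2 = θ, which rearranges to x2 = −(w1/w2)·x1 + θ/w2:

```python
# Check for question 11 (assumes the boundary w1*x1 + w2*x2 = theta,
# so x2 = -(w1/w2)*x1 + theta/w2).
w1, w2, theta = -0.4, -0.5, -0.3
slope = -(w1 / w2)        # note the leading minus sign
intercept = theta / w2
print(f"x2 = {slope:.1f}*x1 + {intercept:.1f}")  # → x2 = -0.8*x1 + 0.6
```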
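For question 14, a minimal sketch of a sigmoid unit: the logistic function takes the net excitation (the weighted input sum) as its argument. The input and weight values are illustrative.

```python
import math

def sigmoid(net):
    """Logistic sigmoid of the net excitation."""
    return 1.0 / (1.0 + math.exp(-net))

def unit_output(inputs, weights, bias):
    # Net excitation = weighted sum of inputs plus bias,
    # then squashed by the sigmoid.
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(net)

print(unit_output([1.0, 0.5], [0.4, -0.2], 0.1))
```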
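For question 16, a sketch of min-max normalization, which rescales each value into [0, 1] with the smallest value mapping to 0 and the largest to 1:

```python
# Min-max normalization: (x - min(X)) / (max(X) - min(X)).

def min_max_normalize(values):
    lo, hi = min(values), max(values)
    return [(x - lo) / (hi - lo) for x in values]

print(min_max_normalize([2, 4, 6, 10]))  # → [0.0, 0.25, 0.5, 1.0]
```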