cs571_Week_2_Lecture_questions_1 (pdf)

School: University of Nevada, Las Vegas
Course: 571
Subject: Computer Science
Date: Jun 11, 2024
Pages: 8
Uploaded by: SuperInternet12237

What's a key difference between biological neural networks and artificial neural networks?
O Biological neurons receive signals from many other neurons.
O Biological neural networks are composed of many neurons.
O Biological neurons have many connections to other neurons.
@ Biological neurons only fire if a threshold is surpassed.
Correct! Artificial neural networks use an activation function to determine how they fire. These activation functions will be addressed in a future lecture.

What does the XOR (short for "exclusive or") operator from Boolean logic do?
O XOR outputs true when neither input is true
@ XOR outputs true when only one input is true
O XOR outputs true only when both inputs are true
O XOR outputs true when either input is true
Correct! Later, you will be asked to postulate why this seemingly simple task cannot be completed by a single-layer neural network.
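The claim that a single-layer network cannot learn XOR can be checked directly. The sketch below (illustrative, not from the lecture) brute-forces a grid of weights for a single linear threshold unit and confirms that none reproduces the XOR truth table:

```python
# Illustration: XOR is not linearly separable, so no single threshold unit
# of the form (w1*x1 + w2*x2 + b > 0) can reproduce it. The weight grid
# here is a made-up search range, not values from the lecture.
import itertools

xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def fires(w1, w2, b, x1, x2):
    return int(w1 * x1 + w2 * x2 + b > 0)

# Try every weight/bias combination on a coarse grid; none matches XOR
# on all four inputs (and no real-valued weights would, either).
found = any(
    all(fires(w1, w2, b, *x) == y for x, y in xor_table.items())
    for w1, w2, b in itertools.product([v / 2 for v in range(-8, 9)], repeat=3)
)
print(found)  # False
```

The same grid search succeeds easily for AND or OR, which is why XOR is the classic counterexample motivating multi-layer networks.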
1. When were artificial neural networks first theorized? (1/1 point)
@ Artificial neural networks were first theorized in the 1940s.
O Artificial neural networks were first theorized in the 1970s.
O Artificial neural networks were first theorized in the 1950s.
O Artificial neural networks were first theorized in the early 2000s.
Correct! McCulloch and Pitts created a mathematical model of neurons able to calculate nearly any logical or arithmetic function.

2. Approximately how many neurons are there in the human brain? (1/1 point)
O ~86 thousand
O ~86 million
@ ~86 billion
O ~86 trillion
Correct! Each neuron may be connected to up to 10,000 other neurons. If this were an artificial neural network, we would need to optimize over hundreds of trillions of parameters.

3. Which algorithm renewed popular interest in artificial neural network research in 1986? (1/1 point)
O Hill climbing
@ Backpropagation
O Gradient descent
O Dijkstra's algorithm
Correct! This algorithm was created, thanks to David Rumelhart, Geoffrey Hinton, and Ronald Williams, to calculate gradients for gradient descent.
4. What is a major challenge for artificial neural networks? (1/1 point)
O Artificial neural networks cannot handle inputs of a different scale.
O Artificial neural networks only work on specific types of output data.
O Artificial neural networks can only make discrete classifications.
@ Artificial neural networks are difficult to analyze and debug.
Correct! Because of the vast number of parameters and connections in a neural network, determining how one has modeled a given dataset, or why something may not be working as expected, is not a straightforward task.

The output of the perceptron is a linear combination of what?
O The sigma vector
@ The input vector
O The output vector
O The weight vector
Correct! We take the dot product of the transposed weight vector with the input vector to get our perceptron's output.
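The dot-product form of the perceptron output described above can be sketched in a few lines. The weight and input values here are made up for illustration:

```python
# Minimal sketch of the linear perceptron: output = w^T x, the dot product
# of the transposed weight vector with the input vector (a single scalar).
# Weights and inputs are illustrative values, not from the lecture.
import numpy as np

w = np.array([0.5, -1.0, 2.0])  # weight vector
x = np.array([1.0, 0.0, 0.5])   # input vector

output = w.T @ x                # linear combination of the inputs
print(output)  # 1.5
```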
How is the activation threshold incorporated into the linear perceptron?
O As a constant input less than 1, scaled by the bias weight
@ As a constant input 1, scaled by the bias weight
O As an input scaled by a constant bias weight
O As a constant input that scales the bias weight
Correct! This constant input 1 is summed together with the other inputs and weights, and it acts as the activation threshold.

1. What is a linear perceptron? (1/1 point)
O A layer of neurons
O A neural network
@ A single neuron
O Many layers of neurons
Correct! It takes in multiple input connections and generates an output.

2. What is the output of a linear perceptron equal to? (1/1 point)
@ The dot product of the inputs with the connection weights
O The maximum of the inputs scaled by the connection weights
O The summation of the inputs and the connection weights
O A vector containing the product of each input and connection weight
Correct! We can adjust and correct this output by adjusting these weights.

3. Which component of a linear perceptron corresponds to the threshold of a biological neuron? (1/1 point)
O The weights
@ The bias
O The input
O The output
Correct! This bias can then be treated as another parameter along with the weights.
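The "constant input 1 scaled by the bias weight" trick can be sketched directly. The numbers below are illustrative; the point is that the thresholded and bias-augmented forms fire identically:

```python
# Sketch (made-up numbers): folding the activation threshold into the
# weights. Instead of testing w.x > threshold, append a constant input 1
# whose weight is the negative threshold, and test the sum against 0.
w = [0.8, 0.2]          # connection weights
threshold = 0.5
x = [1.0, 1.0]          # example inputs

fires_old = sum(wi * xi for wi, xi in zip(w, x)) > threshold

w_aug = w + [-threshold]  # bias weight, now a learnable parameter
x_aug = x + [1.0]         # constant input 1
fires_new = sum(wi * xi for wi, xi in zip(w_aug, x_aug)) > 0

print(fires_old, fires_new)  # True True (the two forms agree)
```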
According to the previous lecture, what is the purpose of the bias term?
O To add nonlinearity
O To adapt the learning rate
O To scale the output
@ To work as the activation threshold
Correct! Instead of comparing the perceptron's output to the threshold, we incorporate it into our linear combination and compare to 0. This allows the threshold to be another learnable parameter, which will be discussed further in a future lecture.

Which element produces the nonlinearity of the nonlinear perceptron?
O Nonlinear bias
@ Nonlinear activation function
O Activation function
O Threshold activation
Correct! Common choices of nonlinear activation functions include sigmoid, tanh, and ReLU.
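The three nonlinear activation functions named above can be written in plain Python for a single pre-activation value z:

```python
# The common nonlinear activation functions mentioned in the feedback:
# sigmoid squashes to (0, 1), tanh to (-1, 1), ReLU zeroes negatives.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    return math.tanh(z)

def relu(z):
    return max(0.0, z)

print(sigmoid(0.0), tanh(0.0), relu(-2.0), relu(3.0))  # 0.5 0.0 0.0 3.0
```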
1. Which element is added to a perceptron to provide non-linearity? (1/1 point)
O The threshold
O The bias
O Connection to additional perceptrons
@ The activation function
Correct! The output is now differentiable instead of a binary value.

2. Which statement best describes the output of the activation function? (1/1 point)
O The output is a matrix.
O The output is all or nothing.
@ The output is continuous.
O The output is a vector.
Correct! This is necessary for differentiation.

3. Which activation method most closely resembles biological neurons? (1/1 point)
@ Threshold activation
O Sine function
O ReLU activation
O Sigmoid function
Correct! However, this is non-differentiable, which would present problems in techniques that will be discussed shortly.

4. Why are nonlinear activation functions needed in order to purposefully construct a network of perceptrons with multiple layers? (1/1 point)
O Nonlinear activation functions more closely model biological neural networks.
O The nonlinear activations allow for a wider range of outputs.
@ Without nonlinear activations, any number of linear combination layers can be performed as a single linear combination.
O Activation functions are not necessary for constructing a network of perceptrons with multiple layers.
Correct! There would be no point in making a sequence of layers that a single linear combination could perform.
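The collapse of stacked linear layers claimed in question 4 is easy to demonstrate: composing two linear maps is itself a linear map. The matrices below are illustrative values:

```python
# Why linear layers without nonlinearity collapse: applying W1 then W2
# is the same as applying the single matrix W2 @ W1. Values are made up.
import numpy as np

W1 = np.array([[1.0, 2.0], [0.0, 1.0]])   # "first layer" weights
W2 = np.array([[0.5, -1.0], [1.0, 0.0]])  # "second layer" weights
x = np.array([2.0, 3.0])                  # example input

two_layers = W2 @ (W1 @ x)   # layer-by-layer forward pass
collapsed = (W2 @ W1) @ x    # one equivalent linear layer

print(np.allclose(two_layers, collapsed))  # True
```

A nonlinear activation between the layers breaks this associativity, which is what makes depth useful.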
Consider an example neural network that has X input neurons, 3 hidden neurons in the only hidden layer, and a single output neuron. The activated values of the neurons in the hidden layer are [.7, .3, .6] and the weights connecting them to the output neuron are [8, -2, 3]. The output neuron's bias has a weight of -5 with input 1. Assuming that there is no activation on the output neuron, what is its output value?
1.8
Correct! This value is the dot product of the inputs with their connection weights, plus the bias.

Consider an example neural network that has X input neurons, 3 hidden neurons in the only hidden layer, and a single output neuron. The activated values of the neurons in the hidden layer are [.2, .5, .4] and the weights connecting them to the output neuron are [9, -1, 7]. The output neuron's bias has a weight of 6 with input 1. Assuming that there is no activation on the output neuron, what is the output value of the output neuron?
10.1
Correct! This value is the dot product of the inputs with their connection weights, plus the bias.
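The output-neuron computation used in these questions is a dot product plus a bias term. A minimal sketch, using illustrative numbers rather than the quiz's exact (partly garbled) values:

```python
# Sketch of the output-neuron computation: with no activation, the output
# is the dot product of the hidden activations with their connection
# weights, plus the bias weight times its constant input 1.
# These numbers are illustrative, not the quiz's exact values.
hidden = [0.7, 0.3, 0.6]   # activated hidden-layer values
weights = [0.8, 0.2, 0.3]  # hidden-to-output connection weights
bias_weight = -0.5         # bias weight, paired with constant input 1

output = sum(h * w for h, w in zip(hidden, weights)) + bias_weight * 1.0
print(round(output, 2))  # 0.3
```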
1. What is typically a requirement for neural networks? (1/1 point)
O A neural network must have at least 3 layers.
@ There is a weight parameter for every neuron connection in the network.
O A neural network must have the same number of neurons in each layer.
O There is a bias parameter for every connection from neurons of one layer to the neurons of the next layer.
Correct! This often gives a very large number of parameters to optimize.

2. Which term best describes the process of taking an input vector through a neural network model to produce an output vector? (1/1 point)
O Function pass
O Backward pass
@ Forward pass
O Mapping pass
Correct! With input to output, we are making a forward pass through the model.

3. Which characteristic is typically necessary for neural networks? (1/1 point)
O A neural network must have the same number of neurons in each layer.
@ There is a bias parameter for every neuron.
O There is only a single weight parameter for every neuron in the network.
O A neural network must have at least 3 layers.
Correct! This bias acts in a way in place of the activation threshold.

4. Which statement best describes a standard forward pass through a neural network? (1/1 point)
O We do not understand how the forward pass maps input to output.
O A forward pass is nondeterministic.
@ A forward pass is deterministic.
O A forward pass is a stochastic process.
Correct! A forward pass with the same input will always produce the same output.
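The "weight per connection, bias per neuron" bookkeeping from questions 1 and 3 can be counted directly. The layer sizes below are made up for illustration:

```python
# Sketch: one weight per connection between adjacent layers, plus one bias
# per non-input neuron, quickly yields many parameters. Sizes are made up.
layer_sizes = [4, 3, 1]  # input, hidden, output neuron counts

n_weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
n_biases = sum(layer_sizes[1:])  # one bias per non-input neuron

print(n_weights + n_biases)  # 19
```

Scaling the same formula to brain-like sizes (billions of neurons, thousands of connections each) gives the "hundreds of trillions of parameters" figure quoted earlier.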