Question
Modify the code below for the baseline CNN (TODO: #1)
We will first construct the CNN model by filling in the correct dimensions of the CNN layers. Please choose the appropriate stride and padding so that the image sizes at the respective layers come out correct (the size arithmetic is worked through right after the layer list). The baseline CNN (TODO: #1) will have the following layers:
- conv1: 3x3 convolution to obtain 16 features of size 32x32
- ReLU activation
- pool: MaxPool2d layer with kernel_size=2, stride=2, and padding=0 to obtain 16 features of size 16x16
- conv2: 3x3 convolution to obtain 32 features of size 16x16
- ReLU activation
- pool: MaxPool2d to obtain 32 features of size 8x8
- conv3: 3x3 convolution to obtain 32 features of size 6x6
- ReLU activation
- linear1: Linear layer with 512 features in the output. Choose the input feature size carefully; you should reshape (flatten) the output of conv3 before this layer.
- ReLU activation
- linear2: Linear layer with 128 features in the output.
- ReLU activation
- linear3: Linear layer with the appropriate number of features in the output. Choose the input feature size carefully.
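A quick sanity check on these sizes, using the standard convolution output-size formula out = floor((in - kernel + 2*padding) / stride) + 1:

conv1: (32 - 3 + 2*1)/1 + 1 = 32, so stride=1, padding=1 preserves the 32x32 size.
pool: (32 - 2 + 0)/2 + 1 = 16, halving each spatial dimension.
conv2: (16 - 3 + 2*1)/1 + 1 = 16, again stride=1, padding=1.
pool: (16 - 2 + 0)/2 + 1 = 8.
conv3: (8 - 3 + 0)/1 + 1 = 6, so stride=1, padding=0 shrinks 8x8 to 6x6.
Flattening the 32 features of size 6x6 gives 32 * 6 * 6 = 1152 input features for linear1.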
Code below:
import torch.nn as nn
import torch.nn.functional as F

# define your CNN here, e.g. activation function: F.sigmoid or F.relu
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # YOUR CODE HERE

    def forward(self, x):
        # YOUR CODE HERE
        return x
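A minimal sketch of one way to fill in this skeleton, assuming 3-channel 32x32 inputs (e.g. CIFAR-10) and 10 output classes; the input channel count and the class count are assumptions, since the prompt does not state them:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # 3 input channels is an assumption (e.g. RGB images such as CIFAR-10)
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=1)   # -> 16 x 32 x 32
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2, padding=0)        # halves H and W
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, stride=1, padding=1)  # -> 32 x 16 x 16
        self.conv3 = nn.Conv2d(32, 32, kernel_size=3, stride=1, padding=0)  # -> 32 x 6 x 6
        self.linear1 = nn.Linear(32 * 6 * 6, 512)  # 1152 flattened features in
        self.linear2 = nn.Linear(512, 128)
        self.linear3 = nn.Linear(128, 10)          # 10 classes is an assumption

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))  # 16 x 32 x 32 -> 16 x 16 x 16
        x = self.pool(F.relu(self.conv2(x)))  # 32 x 16 x 16 -> 32 x 8 x 8
        x = F.relu(self.conv3(x))             # 32 x 8 x 8 -> 32 x 6 x 6
        x = x.view(x.size(0), -1)             # flatten to (batch, 1152)
        x = F.relu(self.linear1(x))
        x = F.relu(self.linear2(x))
        x = self.linear3(x)                   # raw logits; pair with nn.CrossEntropyLoss
        return x

# Quick shape check: a dummy 3x32x32 batch should yield (1, 10) logits.
net = Net()
print(net(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 10])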