
Question

Please code this in Python; I really need to see that this works. Kindly help me out.

You can use any sample dataset for the demo.

Build a Grid_Search_NN_model that has the same architecture as the MLP model from Question 4. Use grid search to tune two hyperparameters:

  • The number of neurons in the hidden layers of your MLP model (find the best number among 8, 16, and 32). Each hidden layer should have the same number of neurons/nodes, so only one hyperparameter is needed to tune the number of neurons.
  • The learning rate of the SGD optimizer (find the best value among the two numbers 0.01 and 0.1).

Implement grid search to identify the optimal hyperparameter values, and print out the best hyperparameter values and the best cross-validation accuracy. You can use the 3-fold GridSearchCV and KerasClassifier functions on the standardized training set to do this. Build the optimized MLP model on the training set by passing the detected best hyperparameter values to the Grid_Search_NN_model. Print out the precision, recall, and F1-score of the optimized MLP model on the test set.

 

MLP code is given below:

import time
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.preprocessing import StandardScaler

# Load the data
data = pd.read_csv('sample_data.csv')
X = data.iloc[:, :-1].values
y = data.iloc[:, -1].values

# Split the data into train and test sets
from sklearn.model_selection import train_test_split
train_data, test_data, train_labels, test_labels = train_test_split(X, y, test_size=0.2, random_state=0)

# Standardize the data
sc = StandardScaler()
train_datascaled = sc.fit_transform(train_data)
test_datascaled = sc.transform(test_data)

# Train the MLP model
mlp = MLPClassifier(hidden_layer_sizes=(8,8), activation='tanh', solver='sgd', learning_rate_init=0.1, max_iter=10)
start_time = time.time()
mlp.fit(train_datascaled, train_labels)
end_time = time.time()
exec_time = (end_time - start_time) * 1000
print("Execution time: {:.2f} ms".format(exec_time))

# Predict the test set labels
predictions = mlp.predict(test_datascaled)

# Print classification report and confusion matrix
print(confusion_matrix(test_labels, predictions))
print(classification_report(test_labels, predictions))
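Since the expert solution below is blurred, here is a minimal sketch of how the grid search could look. Two assumptions are made so the sketch runs standalone and matches the given MLP code: it uses sklearn's `MLPClassifier` with `GridSearchCV` rather than the `KerasClassifier` wrapper the question mentions (so no TensorFlow is needed), and it uses sklearn's built-in `breast_cancer` dataset as the sample data. `max_iter=200` is also an assumption (the original code's `max_iter=10` rarely converges).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import classification_report

# Sample dataset for the demo (any dataset would work here)
X, y = load_breast_cancer(return_X_y=True)
train_data, test_data, train_labels, test_labels = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Standardize: fit the scaler on the training set only
sc = StandardScaler()
train_scaled = sc.fit_transform(train_data)
test_scaled = sc.transform(test_data)

# Grid_Search_NN_model: same architecture as the MLP above
# (two hidden layers, tanh activation, SGD solver)
base_model = MLPClassifier(activation='tanh', solver='sgd',
                           max_iter=200, random_state=0)

# Tune the two requested hyperparameters: the (shared) hidden-layer
# size among 8/16/32, and the SGD learning rate among 0.01/0.1
param_grid = {
    'hidden_layer_sizes': [(8, 8), (16, 16), (32, 32)],
    'learning_rate_init': [0.01, 0.1],
}
grid = GridSearchCV(base_model, param_grid, cv=3, scoring='accuracy')
grid.fit(train_scaled, train_labels)
print("Best hyperparameters:", grid.best_params_)
print("Best 3-fold CV accuracy: {:.4f}".format(grid.best_score_))

# Rebuild the optimized MLP with the detected best values and
# report precision, recall, and F1-score on the test set
best_mlp = MLPClassifier(
    activation='tanh', solver='sgd', max_iter=200, random_state=0,
    hidden_layer_sizes=grid.best_params_['hidden_layer_sizes'],
    learning_rate_init=grid.best_params_['learning_rate_init'])
best_mlp.fit(train_scaled, train_labels)
predictions = best_mlp.predict(test_scaled)
print(classification_report(test_labels, predictions))
```

If your assignment specifically requires Keras, the same structure carries over: wrap the model-building function with scikeras's `KerasClassifier` and pass the grid keys through to the wrapped model instead of to `MLPClassifier`.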
