Question
Implement a Support Vector Machine for MNIST handwritten digit recognition in Python,
using available subroutines for solving a quadratic optimization problem.
1) Use the provided discretized handwritten digit data sets (both training and testing),
scaled between 0 and 1; fractional values are fine.
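
A minimal loading sketch, assuming the four files are plain-text matrices with one flattened digit image per row (the exact file format is an assumption; adjust the loader to the real layout):

```python
import numpy as np

# Assumption: whitespace-delimited text, one digit image per row,
# pixel values already scaled into [0, 1] as the assignment states.
train2 = np.loadtxt("train2")
train5 = np.loadtxt("train5")
test2  = np.loadtxt("test2")
test5  = np.loadtxt("test5")
```
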
2) Formulate the dual soft-margin SVM in Python by specifying all the required matrices
and vectors for the quadratic optimization problem.
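
In standard QP form the dual problem is: minimize (1/2) a^T P a + q^T a subject to G a <= h and A a = b, where P_ij = y_i y_j K(x_i, x_j), q = -1, the inequalities encode 0 <= a_i <= C, and the equality encodes sum_i a_i y_i = 0. A sketch of the matrix assembly, assuming cvxopt as the available QP subroutine:

```python
from cvxopt import matrix, solvers

def build_qp(K, y, C):
    """Assemble cvxopt matrices for the dual soft-margin SVM:
    minimize (1/2) a^T P a + q^T a  s.t.  G a <= h,  A a = b."""
    n = len(y)
    P = matrix(np.outer(y, y) * K)        # P_ij = y_i * y_j * K(x_i, x_j)
    q = matrix(-np.ones(n))               # maximizing sum(a) = minimizing -sum(a)
    G = matrix(np.vstack([-np.eye(n), np.eye(n)]))        # -a_i <= 0 and a_i <= C
    h = matrix(np.hstack([np.zeros(n), C * np.ones(n)]))
    A = matrix(y.astype(float).reshape(1, -1))            # sum_i a_i y_i = 0
    b = matrix(0.0)
    return P, q, G, h, A, b
```
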
3) Train the dual soft-margin SVM (the one that incorporates the non-separable case) to
classify 2s vs. 5s only. Select 500 training data points (250 for 2s, 250 for 5s). Use the
radial basis function kernel with γ = 0.05. Use C = 100 as the penalty parameter; increase
it if necessary.
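
A training sketch with the RBF kernel K(x, z) = exp(-γ‖x − z‖²), γ = 0.05, and C = 100; mapping the 2s to label −1 and the 5s to +1 is an arbitrary labeling choice:

```python
def rbf_kernel(X1, X2, gamma=0.05):
    """Pairwise K(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * sq)

C = 100.0
X_train = np.vstack([train2[:250], train5[:250]])   # 500 training digits
y_train = np.hstack([-np.ones(250), np.ones(250)])  # 2s -> -1, 5s -> +1
K = rbf_kernel(X_train, X_train, gamma=0.05)
solvers.options['show_progress'] = False
sol = solvers.qp(*build_qp(K, y_train, C))
alpha = np.ravel(sol['x'])                          # optimal dual variables
```
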
4) Check the optimal parameter b using two different support-vector indices i0. Show the values.
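
For any free support vector i0 (one with 0 < α_i0 < C), the KKT conditions give b = y_i0 − Σ_j α_j y_j K(x_j, x_i0); computing b from two different i0 is a consistency check, since the values should agree up to solver tolerance. A sketch:

```python
eps = 1e-6
free = np.where((alpha > eps) & (alpha < C - eps))[0]   # free support vectors

def bias_from(i0):
    """b = y_i0 - sum_j alpha_j y_j K(x_j, x_i0)."""
    return y_train[i0] - np.sum(alpha * y_train * K[:, i0])

b1, b2 = bias_from(free[0]), bias_from(free[1])
print("b from i0 =", free[0], "->", b1)
print("b from i0 =", free[1], "->", b2)   # should match b1 closely
```
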
5) Calculate the error/accuracy for the testing examples using the decision rule
f(x) = sign(Σ_i α_i y_i K(x_i, x) + b).
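
A sketch of the test-side evaluation; the 500-per-class test split mirrors the SVD experiment below:

```python
def predict(X):
    """Decision rule: sign(sum_i alpha_i y_i K(x_i, x) + b)."""
    K_test = rbf_kernel(X_train, X, gamma=0.05)     # shape (n_train, n_test)
    return np.sign((alpha * y_train) @ K_test + b1)

X_test = np.vstack([test2[:500], test5[:500]])
y_test = np.hstack([-np.ones(500), np.ones(500)])
acc = np.mean(predict(X_test) == y_test)
print("accuracy:", acc, " error:", 1.0 - acc)
```
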
Use your SVM code above for an SVD-based dimensionality reduction experiment. Use
250 "twos" and 250 "fives" (500 training digits in total). Implement an SVD-based
dimensionality reduction. Use 500 digits from each class for testing (1000 in total). Reduce
the dimensionality until the accuracy falls below 90%. What is the smallest dimension that
keeps the accuracy above 90%? Assume the train2, train5, test2, and test5 files are
provided separately.
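
A sketch of the SVD experiment: take the economy SVD of the 500-row training matrix, project both training and test data onto the top-k right singular vectors, retrain the SVM at each k, and sweep k downward. The starting dimension and step size below are arbitrary choices; a finer sweep around the threshold pins down the exact answer.

```python
def svd_reduce(Xtr, Xte, k):
    """Project both sets onto the top-k right singular vectors of the training data."""
    U, S, Vt = np.linalg.svd(Xtr, full_matrices=False)
    Vk = Vt[:k].T                       # (n_features, k) projection basis
    return Xtr @ Vk, Xte @ Vk

def train_and_score(Xtr, ytr, Xte, yte, gamma=0.05, C=100.0):
    """Retrain the dual SVM on reduced data; return test accuracy."""
    K = rbf_kernel(Xtr, Xtr, gamma)
    a = np.ravel(solvers.qp(*build_qp(K, ytr, C))['x'])
    i0 = np.where((a > 1e-6) & (a < C - 1e-6))[0][0]    # one free support vector
    b = ytr[i0] - np.sum(a * ytr * K[:, i0])
    pred = np.sign((a * ytr) @ rbf_kernel(Xtr, Xte, gamma) + b)
    return np.mean(pred == yte)

for k in range(60, 0, -5):              # coarse downward sweep; refine near 90%
    Xtr_k, Xte_k = svd_reduce(X_train, X_test, k)
    acc = train_and_score(Xtr_k, y_train, Xte_k, y_test)
    print("k =", k, " accuracy =", acc)
    if acc < 0.90:
        break
```

The smallest k whose printed accuracy stays above 0.90 before the loop breaks answers the question.
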