Problem 2: Naïve Bayes Classification
To reduce my email load, I decide to use a classifier to determine whether I should read an email or simply file it away. To train my model, I obtain the following data set of binary-valued features about each email, including whether I know the author, whether the email is long, and whether it contains any of several key words, along with my final decision about whether to read it (y = +1 for “read”, y = −1 for “discard”). Help me build the classifier.
| Know author? (X1) | Is long? (X2) | Has “research”? (X3) | Has “grade”? (X4) | Has “lottery”? (X5) | Read? (Y) |
|---|---|---|---|---|---|
| 0 | 0 | 1 | 1 | 0 | -1 |
| 1 | 1 | 0 | 1 | 0 | -1 |
| 0 | 1 | 1 | 1 | 1 | -1 |
| 1 | 1 | 1 | 1 | 0 | -1 |
| 0 | 1 | 0 | 0 | 0 | -1 |
| 1 | 0 | 1 | 1 | 1 | +1 |
| 0 | 0 | 1 | 0 | 0 | +1 |
| 1 | 0 | 0 | 0 | 0 | +1 |
| 1 | 0 | 1 | 1 | 0 | +1 |
| 1 | 1 | 1 | 1 | 1 | -1 |
In the case of any ties, predict class +1. Use a naïve Bayes classifier to make the decisions.
Compute all the probabilities necessary for a naïve Bayes classifier, i.e., the prior probabilities p(Y) for both classes and all the likelihoods p(Xi | Y), for each class Y and feature Xi.
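The requested estimates can be read off the table by counting: the prior p(Y = y) is the fraction of examples with label y, and each likelihood p(Xi = 1 | Y = y) is the fraction of class-y examples in which feature Xi is on. A minimal sketch of that computation (the `predict` helper is illustrative, not part of the problem; the data matrix is transcribed from the table above):

```python
# Rows transcribed from the problem's table: (X1, X2, X3, X4, X5, Y).
data = [
    (0, 0, 1, 1, 0, -1),
    (1, 1, 0, 1, 0, -1),
    (0, 1, 1, 1, 1, -1),
    (1, 1, 1, 1, 0, -1),
    (0, 1, 0, 0, 0, -1),
    (1, 0, 1, 1, 1, +1),
    (0, 0, 1, 0, 0, +1),
    (1, 0, 0, 0, 0, +1),
    (1, 0, 1, 1, 0, +1),
    (1, 1, 1, 1, 1, -1),
]

classes = (+1, -1)
n = len(data)

# Prior p(Y = y): fraction of all examples carrying label y.
prior = {y: sum(1 for row in data if row[-1] == y) / n for y in classes}

# Likelihood p(Xi = 1 | Y = y): maximum-likelihood (counting) estimate
# over the examples of class y; p(Xi = 0 | Y = y) is its complement.
likelihood = {}
for y in classes:
    rows = [row for row in data if row[-1] == y]
    likelihood[y] = [sum(r[i] for r in rows) / len(rows) for i in range(5)]

def predict(x):
    """Naive Bayes decision for a 5-tuple x; ties go to +1 as specified."""
    def score(y):
        p = prior[y]
        for i, xi in enumerate(x):
            p *= likelihood[y][i] if xi == 1 else 1 - likelihood[y][i]
        return p
    return +1 if score(+1) >= score(-1) else -1
```

Counting by hand gives the same numbers this sketch produces: p(Y = −1) = 6/10 and p(Y = +1) = 4/10; for example, p(X1 = 1 | Y = −1) = 3/6 and p(X2 = 1 | Y = +1) = 0/4 (the zero likelihood is a known weakness of plain MLE counts; the problem does not ask for Laplace smoothing, so none is applied here).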