A Decision Tree

In this question we investigate whether students will pass or fail CS 189 based on whether or not they studied, cheated, and slept well before the exam. You are given the following data for five students. There are three features, "Studied," "Slept," and "Cheated." The column "Result" shows the label we want to predict.

Student     Studied   Slept   Cheated   Result
Student 1   Yes       No      No        Passed
Student 2   Yes       No      Yes       Failed
Student 3   No        Yes     No        Failed
Student 4   Yes       Yes     Yes       Failed
Student 5   Yes       Yes     No        Passed

(1) What is the entropy H(Result) at the root node? (There is no need to compute the exact number; you may write it as an arithmetic expression.)

(2) Draw the decision tree where every split maximizes the information gain. (An actual drawing, please; a written description does not suffice.) Do not perform a split on a pure leaf or if the split will produce an empty child; otherwise, split. Explain (with numbers) why you chose the splits you chose.
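Nothing in the question asks for code, but a short Python sketch makes the arithmetic in parts (1) and (2) easy to check. The data list below simply transcribes the table; the function names entropy and info_gain are illustrative choices, not part of the question.

from math import log2

# Each row: (Studied, Slept, Cheated, Result), transcribed from the table above.
data = [
    ("Yes", "No",  "No",  "Passed"),  # Student 1
    ("Yes", "No",  "Yes", "Failed"),  # Student 2
    ("No",  "Yes", "No",  "Failed"),  # Student 3
    ("Yes", "Yes", "Yes", "Failed"),  # Student 4
    ("Yes", "Yes", "No",  "Passed"),  # Student 5
]
FEATURES = {"Studied": 0, "Slept": 1, "Cheated": 2}

def entropy(rows):
    """Shannon entropy (in bits) of the Result column."""
    if not rows:
        return 0.0
    p = sum(1 for r in rows if r[3] == "Passed") / len(rows)
    return sum(-q * log2(q) for q in (p, 1 - p) if q > 0)

def info_gain(rows, feature):
    """Entropy reduction from splitting rows on a Yes/No feature."""
    i = FEATURES[feature]
    yes = [r for r in rows if r[i] == "Yes"]
    no = [r for r in rows if r[i] == "No"]
    weighted = (len(yes) * entropy(yes) + len(no) * entropy(no)) / len(rows)
    return entropy(rows) - weighted

print(f"H(Result) at root = {entropy(data):.3f} bits")
for f in FEATURES:
    print(f"Information gain for {f}: {info_gain(data, f):.3f}")

Running this prints H(Result) ≈ 0.971 bits and gains of about 0.171 (Studied), 0.020 (Slept), and 0.420 (Cheated), which is exactly the comparison the solution below walks through.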
Expert Solution
Step 1: Introduction. A decision tree is grown greedily: at each node we choose the split that most reduces the entropy of Result, i.e., the split with the maximum information gain.

Step 2: Entropy at the root node. Two of the five students passed and three failed, so

H(Result) = -(2/5) log2(2/5) - (3/5) log2(3/5) ≈ 0.971 bits.

Step 3: Information gain of each feature at the root.
- Cheated: the Yes child {Students 2, 4} is all Failed (entropy 0); the No child {Students 1, 3, 5} has 2 Passed, 1 Failed (entropy ≈ 0.918). Gain ≈ 0.971 - (3/5)(0.918) ≈ 0.420.
- Studied: the Yes child {Students 1, 2, 4, 5} has 2 Passed, 2 Failed (entropy 1); the No child {Student 3} is pure. Gain ≈ 0.971 - (4/5)(1) ≈ 0.171.
- Slept: the Yes child {Students 3, 4, 5} has entropy ≈ 0.918; the No child {Students 1, 2} has entropy 1. Gain ≈ 0.971 - (3/5)(0.918) - (2/5)(1) ≈ 0.020.
Cheated has the largest gain, so the root splits on Cheated, and its Yes child is a pure leaf (Failed).

Step 4: Splitting the Cheated = No node ({Students 1, 3, 5}; entropy ≈ 0.918). Splitting on Studied yields two pure children, {Students 1, 5} all Passed and {Student 3} Failed, for a gain of ≈ 0.918. Splitting on Slept leaves a mixed child, for a gain of only ≈ 0.252. We therefore split on Studied. Every leaf is now pure, so no further splits are performed.

Step 5: The decision tree.

Cheated?
├── Yes → Failed           (Students 2, 4)
└── No  → Studied?
          ├── Yes → Passed (Students 1, 5)
          └── No  → Failed (Student 3)
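To double-check the drawn tree mechanically, here is a minimal recursive builder under the same stopping rules (never split a pure leaf, never create an empty child). It reuses data, FEATURES, entropy, and info_gain from the sketch above; the function name build_tree and the majority-vote fallback are illustrative choices, not part of the question.

def build_tree(rows, features, depth=0):
    pad = "  " * depth
    labels = {r[3] for r in rows}
    if len(labels) == 1:  # pure leaf: stop splitting
        print(f"{pad}-> {labels.pop()}")
        return
    # Pick the highest-gain feature, skipping any split that
    # would leave one child empty.
    best = max(
        (f for f in features
         if 0 < sum(r[FEATURES[f]] == "Yes" for r in rows) < len(rows)),
        key=lambda f: info_gain(rows, f),
        default=None,
    )
    if best is None:  # no legal split remains: majority vote
        passed = sum(r[3] == "Passed" for r in rows)
        print(f"{pad}-> {'Passed' if 2 * passed >= len(rows) else 'Failed'}")
        return
    for value in ("Yes", "No"):
        print(f"{pad}{best} = {value}?")
        child = [r for r in rows if r[FEATURES[best]] == value]
        build_tree(child, [f for f in features if f != best], depth + 1)

build_tree(data, list(FEATURES))

Run on the table above, this prints a root split on Cheated and then a split on Studied inside the Cheated = No branch, matching the tree in Step 5.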