
Question
2

You are given the count matrix in Table 4 for the attribute "Previous GPA". The class label is whether a student will pass (C1) or fail (C0) a class. We want to use the attribute "Previous GPA" as our splitting attribute in an inner node of a decision tree with a binary test condition. Which is the best way to partition the attribute values into two groups? Form the count matrices for the different partitions and compute the classification error. Explain your reasoning while considering the metric of classification error to evaluate the different partitions.
Table 4: Previous GPA

Previous GPA   pass (C1)   fail (C0)
Bad                1            4
Fair               6            4
Good               9            3
Very good          3            0
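A minimal sketch of how such a split can be evaluated, using the counts from Table 4. It assumes all 7 ways of grouping the four values into two non-empty sets are considered; the helper function, variable names, and the enumeration strategy are my own, not part of the exercise.

```python
from itertools import combinations

# Counts per attribute value, taken from Table 4: (pass C1, fail C0).
counts = {
    "Bad":       (1, 4),
    "Fair":      (6, 4),
    "Good":      (9, 3),
    "Very good": (3, 0),
}

n_total = sum(c1 + c0 for c1, c0 in counts.values())  # 30 instances in total

def misclassified(group):
    """Errors made by one branch if it predicts its majority class."""
    c1 = sum(counts[v][0] for v in group)
    c0 = sum(counts[v][1] for v in group)
    return (c1 + c0) - max(c1, c0)

values = list(counts)
best = None
# Fix "Bad" in the left group so each of the 2^3 - 1 = 7 binary partitions is
# enumerated exactly once.  (If "Previous GPA" is treated as ordinal, one may
# restrict this to the three order-preserving splits.)
for r in range(len(values) - 1):
    for extra in combinations(values[1:], r):
        left = ("Bad",) + extra
        right = tuple(v for v in values if v not in left)
        error = (misclassified(left) + misclassified(right)) / n_total
        print(f"{left} | {right}: classification error = {error:.3f}")
        if best is None or error < best[0]:
            best = (error, left, right)

print("best partition:", best[1], "|", best[2], f"error = {best[0]:.3f}")
```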
3

Consider the decision trees shown in Figure 1. The decision tree in Figure 1b is a pruned version of the original decision tree in Figure 1a. The training and test sets are shown in Table 5. For every combination of values of attributes A and B, the tables give the number of instances in our dataset that have a positive or a negative label.
Figure 1: (a) Decision Tree 1 (DT1); (b) Decision Tree 2 (DT2). Both trees split on attributes A and B, with leaves labeled + or -.

Table 5: Training set and test set; for each combination of values of A and B, the number of (+) instances and the number of (-) instances.
3.1

Estimate the generalization error rate of both trees shown in Figure 1 (DT1, DT2) using the optimistic approach and the pessimistic approach. To account for model complexity with the pessimistic approach, use a penalty value of Ω = 2 for each leaf node.
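Since the tree structures and the counts of Table 5 are not reproduced above, the sketch below only illustrates how the two estimates are usually computed. The functions are my own, and the example numbers are hypothetical placeholders, not the data of the exercise.

```python
# Assumed definitions (the exercise's own conventions may differ):
#   optimistic estimate  = training errors / N_train
#   pessimistic estimate = (training errors + penalty * number of leaves) / N_train

def optimistic_error(train_errors: int, n_train: int) -> float:
    """Optimistic estimate: resubstitution (training) error rate."""
    return train_errors / n_train

def pessimistic_error(train_errors: int, n_leaves: int, n_train: int,
                      penalty: float = 2.0) -> float:
    """Pessimistic estimate: training errors plus a complexity penalty per leaf."""
    return (train_errors + penalty * n_leaves) / n_train

# Hypothetical values; replace them with the counts read off Table 5 / Figure 1.
n_train = 100
trees = {
    "DT1 (unpruned)": {"errors": 10, "leaves": 7},
    "DT2 (pruned)":   {"errors": 15, "leaves": 4},
}

for name, t in trees.items():
    print(name,
          f"optimistic = {optimistic_error(t['errors'], n_train):.3f},",
          f"pessimistic = {pessimistic_error(t['errors'], t['leaves'], n_train):.3f}")
```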