
Question

Can you please give a correct and proper solution?

Consider the hypothesis space defined over the instances shown below; each hypothesis (apple taste) is characterized by a 4-tuple <Color, Crispiness, Spot, Fragrant>. Hand-trace the ID3 algorithm to build a decision tree, then predict the target value Taste = Sweet/Tart for the following instances:
a) <Red, High, Some, No>
b) <Red, Low, Some, Yes>
c) <Yellow, Low, Some, No>
d) <Green, High, None, No>
e) <Green, Mid, Some, Yes>
Now suppose the actual tastes of the five apples above are "Sweet, Sweet, Sweet, Tart, Tart". What is the accuracy of the decision tree? Show all the steps and include the corresponding confusion matrix for the accuracy calculation. (10 pts)
Melon | Color  | Crispiness | Spot | Fragrant | Taste
------|--------|------------|------|----------|------
1     | Red    | High       | None | Yes      | Sweet
2     | Red    | High       | None | No       | Sweet
3     | Yellow | High       | None | Yes      | Sweet
4     | Yellow | Mid        | Some | No       | Tart
5     | Yellow | Low        | None | No       | Sweet
6     | Yellow | Mid        | Some | Yes      | Tart
7     | Green  | Low        | Some | No       | Sweet
8     | Red    | Low        | None | Yes      | Tart
9     | Red    | Low        | None | No       | Sweet
10    | Yellow | Mid        | None | Yes      | Sweet
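As a quick numerical check before the hand trace, the first-split information gains implied by this table can be computed with a short script (a sketch, assuming Sweet/Tart as the two classes and the tuple order <Color, Crispiness, Spot, Fragrant>):

```python
from collections import Counter
from math import log2

# The ten training melons above, as (<Color, Crispiness, Spot, Fragrant>, Taste)
ROWS = [
    (("Red",    "High", "None", "Yes"), "Sweet"),
    (("Red",    "High", "None", "No"),  "Sweet"),
    (("Yellow", "High", "None", "Yes"), "Sweet"),
    (("Yellow", "Mid",  "Some", "No"),  "Tart"),
    (("Yellow", "Low",  "None", "No"),  "Sweet"),
    (("Yellow", "Mid",  "Some", "Yes"), "Tart"),
    (("Green",  "Low",  "Some", "No"),  "Sweet"),
    (("Red",    "Low",  "None", "Yes"), "Tart"),
    (("Red",    "Low",  "None", "No"),  "Sweet"),
    (("Yellow", "Mid",  "None", "Yes"), "Sweet"),
]

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, i):
    """Class entropy minus expected entropy after splitting on attribute i."""
    groups = {}
    for x, taste in rows:
        groups.setdefault(x[i], []).append(taste)
    remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return entropy([t for _, t in rows]) - remainder

gains = {name: round(info_gain(ROWS, i), 3)
         for i, name in enumerate(["Color", "Crispiness", "Spot", "Fragrant"])}
# Crispiness has the largest gain, so ID3 picks it as the root split.
```

This yields gains of roughly 0.071 (Color), 0.281 (Crispiness), 0.192 (Spot), and 0.035 (Fragrant), matching the worked trace in the solution.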
Expert Solution

Step 1: Formulas

For a node with P positive (Sweet) and N negative (Tart) examples:

Entropy(class) = -(P/(P+N)) log2(P/(P+N)) - (N/(P+N)) log2(N/(P+N))

For each value i of an attribute, with Pi positives and Ni negatives:

I(Pi, Ni) = -(Pi/(Pi+Ni)) log2(Pi/(Pi+Ni)) - (Ni/(Pi+Ni)) log2(Ni/(Pi+Ni))

Entropy(attribute) = Σi ((Pi+Ni)/(P+N)) · I(Pi, Ni)

Gain(attribute) = Entropy(class) - Entropy(attribute)

Step 2: Class entropy

Taking Sweet as positive: P = 7 (melons 1, 2, 3, 5, 7, 9, 10) and N = 3 (melons 4, 6, 8).

Entropy(class) = -(7/10) log2(7/10) - (3/10) log2(3/10) = 0.3602 + 0.5211 = 0.8813

Step 3: Choose the root

Color    P  N  I(P,N)
Red      3  1  0.8113
Yellow   3  2  0.9710
Green    1  0  0
Entropy(Color) = (4/10)(0.8113) + (5/10)(0.9710) + (1/10)(0) = 0.8100
Gain(Color) = 0.8813 - 0.8100 = 0.0713

Crispiness  P  N  I(P,N)
High        3  0  0
Mid         1  2  0.9183
Low         3  1  0.8113
Entropy(Crispiness) = (3/10)(0) + (3/10)(0.9183) + (4/10)(0.8113) = 0.6000
Gain(Crispiness) = 0.8813 - 0.6000 = 0.2813

Spot   P  N  I(P,N)
None   6  1  0.5917
Some   1  2  0.9183
Entropy(Spot) = (7/10)(0.5917) + (3/10)(0.9183) = 0.6897
Gain(Spot) = 0.8813 - 0.6897 = 0.1916

Fragrant  P  N  I(P,N)
Yes       3  2  0.9710
No        4  1  0.7219
Entropy(Fragrant) = (5/10)(0.9710) + (5/10)(0.7219) = 0.8464
Gain(Fragrant) = 0.8813 - 0.8464 = 0.0349

Crispiness has the maximum gain (0.2813), so Crispiness is the root node.

Step 4: Branches

Crispiness = High → melons {1, 2, 3}, all Sweet → leaf: Sweet.

Crispiness = Mid → melons {4, 6, 10} (1 Sweet, 2 Tart), entropy 0.9183. Color is Yellow for all three (gain 0); Fragrant gives gain 0.2516; Spot splits perfectly: Some → {4, 6} both Tart, None → {10} Sweet (gain 0.9183). Split on Spot.

Crispiness = Low → melons {5, 7, 8, 9} (3 Sweet, 1 Tart), entropy 0.8113. Fragrant splits perfectly: Yes → {8} Tart, No → {5, 7, 9} all Sweet (gain 0.8113), beating Color (0.3113) and Spot (0.1226). Split on Fragrant.

Final tree:

Crispiness = High: Sweet
Crispiness = Mid:
    Spot = Some: Tart
    Spot = None: Sweet
Crispiness = Low:
    Fragrant = Yes: Tart
    Fragrant = No:  Sweet

Step 5: Predictions

a) <Red, High, Some, No>   → Crispiness = High → Sweet
b) <Red, Low, Some, Yes>   → Crispiness = Low, Fragrant = Yes → Tart
c) <Yellow, Low, Some, No> → Crispiness = Low, Fragrant = No → Sweet
d) <Green, High, None, No> → Crispiness = High → Sweet
e) <Green, Mid, Some, Yes> → Crispiness = Mid, Spot = Some → Tart

Step 6: Accuracy and confusion matrix

The actual tastes are Sweet, Sweet, Sweet, Tart, Tart, so the tree is correct on a, c, and e and wrong on b and d. Taking Sweet as the positive class:

               Predicted Sweet   Predicted Tart
Actual Sweet        2 (TP)           1 (FN)
Actual Tart         1 (FP)           1 (TN)

Accuracy = (TP + TN) / (TP + TN + FP + FN) = (2 + 1) / 5 = 0.60 = 60%
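The hand trace above can be cross-checked end to end with a short script. This is a minimal ID3 sketch, not a library implementation; the attribute tuples follow the <Color, Crispiness, Spot, Fragrant> order from the question, and unseen attribute values at prediction time fall back to the majority class Sweet (an assumption, since plain ID3 leaves this case unspecified):

```python
from collections import Counter
from math import log2

# Training table: (<Color, Crispiness, Spot, Fragrant>, Taste)
DATA = [
    (("Red",    "High", "None", "Yes"), "Sweet"),
    (("Red",    "High", "None", "No"),  "Sweet"),
    (("Yellow", "High", "None", "Yes"), "Sweet"),
    (("Yellow", "Mid",  "Some", "No"),  "Tart"),
    (("Yellow", "Low",  "None", "No"),  "Sweet"),
    (("Yellow", "Mid",  "Some", "Yes"), "Tart"),
    (("Green",  "Low",  "Some", "No"),  "Sweet"),
    (("Red",    "Low",  "None", "Yes"), "Tart"),
    (("Red",    "Low",  "None", "No"),  "Sweet"),
    (("Yellow", "Mid",  "None", "Yes"), "Sweet"),
]

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, i):
    groups = {}
    for x, y in rows:
        groups.setdefault(x[i], []).append(y)
    remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return entropy([y for _, y in rows]) - remainder

def id3(rows, attrs):
    labels = [y for _, y in rows]
    if len(set(labels)) == 1:          # pure node -> leaf
        return labels[0]
    if not attrs:                      # no attributes left -> majority leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda i: info_gain(rows, i))
    children = {}
    for x, y in rows:
        children.setdefault(x[best], []).append((x, y))
    rest = [a for a in attrs if a != best]
    return (best, {v: id3(sub, rest) for v, sub in children.items()})

def predict(tree, x, default="Sweet"):
    # Majority-class fallback for attribute values never seen in
    # training (an assumption; plain ID3 does not specify this).
    while isinstance(tree, tuple):
        attr, children = tree
        if x[attr] not in children:
            return default
        tree = children[x[attr]]
    return tree

tree = id3(DATA, [0, 1, 2, 3])         # root ends up being attribute 1, Crispiness
queries = [("Red", "High", "Some", "No"), ("Red", "Low", "Some", "Yes"),
           ("Yellow", "Low", "Some", "No"), ("Green", "High", "None", "No"),
           ("Green", "Mid", "Some", "Yes")]
actual = ["Sweet", "Sweet", "Sweet", "Tart", "Tart"]
preds = [predict(tree, q) for q in queries]

# Confusion matrix with Sweet as the positive class
tp = sum(p == a == "Sweet" for p, a in zip(preds, actual))
tn = sum(p == a == "Tart" for p, a in zip(preds, actual))
fn = sum(a == "Sweet" and p == "Tart" for a, p in zip(actual, preds))
fp = sum(a == "Tart" and p == "Sweet" for a, p in zip(actual, preds))
accuracy = (tp + tn) / len(actual)
```

Running this reproduces the trace: Crispiness at the root, predictions Sweet, Tart, Sweet, Sweet, Tart for a)–e), and accuracy 0.60.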