

I am completely lost on this Machine Learning practice problem. No solution was provided for it.

Train a decision tree to predict whether a person makes over $50K a year using the UCI Adult Census Income dataset. The dataset contains 48,842 instances, each described by 14 attributes. The last column, named “income”, is the binary outcome to predict (i.e., ≤50K or >50K). Note that there are both numeric and discrete attributes.

We are not allowed to use libraries such as scikit-learn. The code must be in Python.

Tasks:

  • The dataset can be downloaded from https://www.kaggle.com/uciml/adult-census-income. After downloading it, split it into a training set (70%), a validation set (10%), and a test set (20%).
  • Implement the C4.5 decision tree algorithm. Your implementation should let the user decide when to stop the recursive splitting.
  • Run your algorithm on the training set to learn a decision tree for each of the cut-off values θ_I = 0.2, 0.4, 0.6, 0.8 in the attached pseudocode (also covered in class), and plot the training error for each cut-off. Hint: you can use the training set to grow the tree and the validation set to calculate the error.
  • Test your decision trees on the test set, and plot the test errors together with the training errors for each tree you train. What is your conclusion?
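For the first task, a minimal sketch of the 70/10/20 split using only the standard library (the file path, the seed, and the choice to drop rows containing "?" missing-value markers are my assumptions, not part of the assignment):

```python
import csv
import random

def load_and_split(path, seed=0):
    """Read the Kaggle CSV and split its rows 70/10/20 into train/val/test."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                        # first row: column names
        rows = [r for r in reader if "?" not in r]   # drop rows with missing values
    random.Random(seed).shuffle(rows)                # fixed seed for reproducibility
    n = len(rows)
    n_train = int(0.7 * n)
    n_val = int(0.1 * n)
    train = rows[:n_train]
    val = rows[n_train:n_train + n_val]
    test = rows[n_train + n_val:]
    return header, train, val, test
```

Shuffling before slicing matters here: the Kaggle CSV is not guaranteed to be in random order, and a sequential split could put systematically different people in each partition.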
GenerateTree(X)
    If NodeEntropy(X) < θ_I                      /* eq. 9.3 */
        Create leaf labelled by majority class in X
        Return
    i ← SplitAttribute(X)
    For each branch of x_i
        Find X_i falling in branch
        GenerateTree(X_i)

SplitAttribute(X)
    MinEnt ← MAX
    For all attributes i = 1, ..., d
        If x_i is discrete with n values
            Split X into X_1, ..., X_n by x_i
            e ← SplitEntropy(X_1, ..., X_n)      /* eq. 9.8 */
            If e < MinEnt: MinEnt ← e; bestf ← i
        Else /* x_i is numeric */
            For all possible splits
                Split X into X_1, X_2 on x_i
                e ← SplitEntropy(X_1, X_2)
                If e < MinEnt: MinEnt ← e; bestf ← i
    Return bestf
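The pseudocode above can be sketched in plain Python. This is a simplified illustration under my own assumptions, not a full C4.5 implementation: it handles discrete attributes only, uses θ_I (`theta`) as the entropy cut-off, and stores the tree as nested dicts with "attr"/"branches" keys; numeric thresholds, gain ratio, and pruning are left as exercises.

```python
import math
from collections import Counter

def node_entropy(labels):
    """Entropy of the class distribution at a node (eq. 9.3 in the pseudocode)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_entropy(partitions):
    """Weighted average entropy of the child nodes (eq. 9.8 in the pseudocode)."""
    total = sum(len(p) for p in partitions)
    return sum(len(p) / total * node_entropy(p) for p in partitions if p)

def split_attribute(rows, labels, attrs):
    """Return the attribute index with the lowest split entropy (bestf)."""
    best_f, min_ent = None, float("inf")
    for i in attrs:
        groups = {}
        for row, y in zip(rows, labels):
            groups.setdefault(row[i], []).append(y)   # partition labels by value
        e = split_entropy(list(groups.values()))
        if e < min_ent:
            min_ent, best_f = e, i
    return best_f

def generate_tree(rows, labels, attrs, theta):
    """Recursive tree growth; stop splitting when node entropy < theta."""
    if node_entropy(labels) < theta or not attrs:
        return Counter(labels).most_common(1)[0][0]   # leaf: majority class
    i = split_attribute(rows, labels, attrs)
    tree = {"attr": i, "branches": {}}
    for value in set(row[i] for row in rows):
        sub = [(r, y) for r, y in zip(rows, labels) if r[i] == value]
        tree["branches"][value] = generate_tree(
            [r for r, _ in sub], [y for _, y in sub], attrs - {i}, theta)
    return tree
```

Passing `theta` down the recursion is what lets the user control when splitting stops, as the second task requires; larger θ_I values produce shallower trees.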
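For the evaluation tasks, one possible way to compute classification error (hypothetical helpers, assuming the tree is stored as nested dicts with "attr"/"branches" keys and that leaves are plain class labels; the `default` fallback for attribute values unseen during training is my assumption):

```python
def predict(tree, row, default="<=50K"):
    """Walk the tree until a leaf (a plain class label) is reached."""
    while isinstance(tree, dict):
        value = row[tree["attr"]]
        if value not in tree["branches"]:   # unseen value: fall back to default
            return default
        tree = tree["branches"][value]
    return tree

def error_rate(tree, rows, labels):
    """Fraction of misclassified examples in (rows, labels)."""
    wrong = sum(predict(tree, r) != y for r, y in zip(rows, labels))
    return wrong / len(rows)
```

Calling `error_rate` on the training set and on the test set for each θ_I gives the two curves the last task asks you to plot side by side.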