Imagine we are training a decision tree, and we are at a node. Each data point is (X1, X2, X3, Y), where X1, X2, and X3 are independent variables and Y is the dependent variable. The data are shown below. Let us train a decision tree with this data and call this tree T1. What feature will we split on at the root?

X1 X2 X3 Y
1 1 1 1 1 1 1

Answer choices: X1; X2; X1 or X2; X3
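The root split is normally chosen by computing the information gain of each feature and picking the one with the highest gain. Below is a minimal sketch of that criterion; since the data table in the question is incomplete, the `rows` and `labels` values here are purely hypothetical placeholders, and the entropy/gain functions are a standard ID3-style illustration rather than the specific method assumed by the course.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, feature_index):
    """Entropy reduction obtained by splitting on the given feature."""
    parent = entropy(labels)
    total = len(labels)
    weighted_child_entropy = 0.0
    for value in set(row[feature_index] for row in rows):
        subset = [y for row, y in zip(rows, labels) if row[feature_index] == value]
        weighted_child_entropy += (len(subset) / total) * entropy(subset)
    return parent - weighted_child_entropy

# Hypothetical data in the same (X1, X2, X3) -> Y format as the question;
# the actual table in the original post is not fully recoverable.
rows   = [(1, 1, 1), (1, 1, 0), (0, 0, 1), (0, 1, 0)]
labels = [1, 1, 0, 0]

gains = {f"X{i + 1}": information_gain(rows, labels, i) for i in range(3)}
print(gains)  # the root split is the feature with the highest gain
```

With the real table filled in, the same loop would directly answer which of X1, X2, or X3 (or a tie between X1 and X2) gives the largest gain at the root.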