
Question

KNN, decision tree (machine learning)

Please answer parts (b) and (c) specifically.

6. Consider a binary classification problem using 1-nearest neighbors (1-NN) with the Euclidean distance metric. We have N one-dimensional training points x(1), x(2), ..., x(N) and corresponding labels y(1), y(2), ..., y(N), with x(i) ∈ R and y(i) ∈ {0, 1}. Assume the points x(1), x(2), ..., x(N) are in ascending order by value. If there are ties during the 1-NN algorithm, we break ties by choosing the label corresponding to the x(i) with the lower value.
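
For concreteness, here is a minimal sketch (not part of the original problem) of the 1-NN rule in 1-D with the stated tie-breaking convention; the function and variable names are my own.

```python
import numpy as np

def predict_1nn_1d(x_train, y_train, x_query):
    """1-NN in 1-D with ties broken in favor of the training point
    with the smaller x value, as the problem specifies."""
    x_train = np.asarray(x_train, dtype=float)  # assumed sorted ascending
    y_train = np.asarray(y_train)
    dists = np.abs(x_train - x_query)
    best = dists.min()
    # among all points at the minimum distance, pick the smallest x value
    tied = np.where(np.isclose(dists, best))[0]
    return y_train[tied[np.argmin(x_train[tied])]]

# tiny made-up example: query 0.5 is equidistant from x=0.0 and x=1.0,
# so the tie goes to x=0.0 and the prediction is its label, 0
print(predict_1nn_1d([0.0, 1.0, 2.0], [0, 1, 0], 0.5))  # -> 0
```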

(a) Is it possible to build a decision tree that behaves exactly the same as the 1-nearest neighbor classifier? Assume that the decision at each node takes the form “x ≤ t or x > t,” where t ∈ R.
◯ Yes
◯ No
If your answer is yes, please explain how you will construct the decision tree. If your answer is no, explain why it’s not possible.
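
As one way to think about part (a), the sketch below builds an axis-threshold tree whose splits sit at the midpoints between consecutive (sorted) training points, so each leaf covers exactly the interval of queries whose nearest neighbor is a single training point; sending midpoint queries down the “x ≤ t” branch matches the tie rule above. This is only an illustrative construction under those assumptions, not a claimed model solution.

```python
def build_tree_1d(x_train, y_train):
    """Build a nested-dict tree with decisions of the form "x <= t or x > t".
    Splits are placed at midpoints between consecutive sorted training points,
    so every leaf holds the label of exactly one training point."""
    if len(x_train) == 1:
        return {"leaf": y_train[0]}
    m = len(x_train) // 2
    t = 0.5 * (x_train[m - 1] + x_train[m])  # midpoint between neighbors
    return {
        "t": t,
        "left": build_tree_1d(x_train[:m], y_train[:m]),
        "right": build_tree_1d(x_train[m:], y_train[m:]),
    }

def tree_predict(node, x):
    if "leaf" in node:
        return node["leaf"]
    # a query exactly at the threshold goes left, i.e. toward the smaller x,
    # which agrees with the 1-NN tie-breaking rule
    return tree_predict(node["left"] if x <= node["t"] else node["right"], x)

# quick check on made-up data
tree = build_tree_1d([0.0, 1.0, 2.0], [0, 1, 0])
print([tree_predict(tree, q) for q in [-1.0, 0.5, 0.6, 1.4, 3.0]])  # [0, 0, 1, 1, 0]
```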

(b) Let’s add a dimension! Now assume the training points are 2-dimensional, where x(i) = (x1(i), x2(i)) ∈ R², and the decision at each node takes the form “xj ≤ t or xj > t,” where t ∈ R and j ∈ {1, 2}. Give an example with at most 3 training points for which it isn’t possible to build a decision tree that behaves exactly the same as a 1-nearest neighbor classifier.
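
If you want to probe candidate point sets for part (b), a small sketch like the one below rasterizes the 1-NN decision regions on a coarse grid so you can see whether the boundary is axis-aligned. The two corner points here are only a made-up configuration to experiment with, and extending the tie rule to 2-D by taking the first training point in the given order is my assumption, since the problem only defines ties in 1-D.

```python
import numpy as np

def predict_1nn_2d(X_train, y_train, q):
    """1-NN with Euclidean distance in 2-D; ties go to the earliest training
    point in the given ordering (an assumption, see note above)."""
    d = np.linalg.norm(X_train - q, axis=1)
    return y_train[int(np.argmin(d))]  # argmin returns the first minimizer

# made-up configuration: opposite corners of the unit square
X_train = np.array([[0.0, 0.0], [1.0, 1.0]])
y_train = np.array([0, 1])

# print the predicted label over a coarse grid (x2 decreasing row by row)
for x2 in np.linspace(1.0, 0.0, 6):
    print("".join(str(predict_1nn_2d(X_train, y_train, np.array([x1, x2])))
                  for x1 in np.linspace(0.0, 1.0, 6)))
# a diagonal/staircase frontier between the 0s and 1s indicates a decision
# boundary that is not parallel to either coordinate axis
```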

(c) Assuming we have 2-dimensional training points x(i) = (x1(i), x2(i)) ∈ R² and the decision at each node takes the form “xj ≤ t or xj > t,” where t ∈ R and j ∈ {1, 2}, under what conditions is it possible to build a decision tree that behaves exactly the same as a 1-nearest neighbor classifier? Explain why it is possible to build the decision tree as stated in part (a) but, in general, not possible in part (b).
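
For part (c), one useful empirical probe is to fit an unconstrained axis-aligned tree and compare it with 1-NN on a dense grid: any disagreement shows that this particular tree is not equivalent (agreement on a grid is not a proof of equivalence, so treat it only as a sanity check). The sketch below uses scikit-learn's DecisionTreeClassifier for the tree and a hand-rolled 1-NN; the three training points are a made-up placeholder, not a suggested answer.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def nn1(X_train, y_train, Q):
    """Vectorized 1-NN (Euclidean); ties go to the first training point."""
    d = np.linalg.norm(Q[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[np.argmin(d, axis=1)]

# made-up 2-D configuration to probe
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
y_train = np.array([0, 1, 0])

# fully grown axis-aligned tree (fits the 3 training points exactly)
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# compare both classifiers on a dense grid around the training points
g = np.linspace(-0.5, 1.5, 201)
Q = np.array([[a, b] for a in g for b in g])
disagree = np.mean(tree.predict(Q) != nn1(X_train, y_train, Q))
print(f"fraction of grid points where tree and 1-NN disagree: {disagree:.3f}")
```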
