EE5731_Questions_updated_7-7

School: National University of Singapore
Course: 5731
Subject: Computer Science
Date: Nov 24, 2024
Pages: 1
classifier. This helps the new classifier to focus on the examples that the current ensemble finds difficult.

Answer: (d) The weights of samples are adjusted in each iteration, influencing the decision boundary for the next weak classifier.

Justification: The adjustment of weights is a core part of how AdaBoost operates. By adjusting the weights, the algorithm changes the training focus for the following weak classifiers, thus affecting the decision boundary they produce.

Effectiveness of Weak Classifiers

What is the role of weak classifiers in the AdaBoost algorithm?
(a) Weak classifiers are discarded as they do not contribute to the final model.
(b) Weak classifiers must classify all training samples correctly before being included in the final model.
(c) Even though they are weak, these classifiers form the building blocks of the strong classifier AdaBoost aims to construct.
(d) Weak classifiers are only used to classify easy samples that have been correctly classified in previous rounds.
Justify your answer.

Answer: (c) Even though they are weak, these classifiers form the building blocks of the strong classifier AdaBoost aims to construct.

Justification: The term "weak classifier" refers to a classifier that is only slightly correlated with the true classification (better than random guessing). AdaBoost combines these weak classifiers in a weighted manner to form a strong classifier.

Final Hypothesis of AdaBoost

What does the final hypothesis of AdaBoost represent?
(a) It is a combination of all weak classifiers, regardless of their individual accuracy.
(b) It is a single best weak classifier selected from all the rounds.
(c) It is a strong classifier obtained by combining the weighted vote of each weak classifier.
(d) It is an average of all the decision boundaries created by the weak classifiers.

Answer: (c) It is a strong classifier obtained by combining the weighted vote of each weak classifier.
Justification: The final hypothesis of AdaBoost is a strong classifier that results from the weighted combination of all the weak classifiers trained in each round. The weights reflect how well each weak classifier performs on the training set, with more accurate classifiers having more influence on the final outcome.

Initialization and Iteration in AdaBoost

Select all true statements regarding the initialization and iteration process in AdaBoost.
(a) The initial weights are set such that each sample, regardless of its class, has the same weight.
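The mechanisms discussed above (equal initial sample weights, per-round reweighting so misclassified samples gain influence, and a final hypothesis that is the sign of a weighted vote of weak classifiers) can be sketched in code. This is a minimal illustrative sketch, not the course's reference implementation; the function names, the toy dataset, and the use of simple threshold stumps as the weak-classifier pool are all assumptions made for the example.

```python
import numpy as np

def adaboost(X, y, stumps, T):
    """Minimal AdaBoost sketch (hypothetical names, for illustration).

    X: 1-D feature array; y: labels in {-1, +1};
    stumps: candidate weak classifiers, each mapping X -> {-1, +1};
    T: number of boosting rounds.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)           # equal initial weight for every sample
    ensemble = []                      # list of (alpha, weak_classifier) pairs
    for _ in range(T):
        # pick the stump with the lowest weighted error under current weights
        errs = [np.sum(w * (h(X) != y)) for h in stumps]
        best = int(np.argmin(errs))
        h, err = stumps[best], errs[best]
        if err >= 0.5:                 # no weak learner beats random guessing
            break
        # more accurate classifiers get a larger vote weight alpha
        alpha = 0.5 * np.log((1.0 - err) / (err + 1e-12))
        # reweight: misclassified samples gain weight, correct ones lose it,
        # shifting the training focus of the next weak classifier
        w *= np.exp(-alpha * y * h(X))
        w /= w.sum()
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, X):
    # final hypothesis: sign of the weighted vote of all weak classifiers
    votes = sum(alpha * h(X) for alpha, h in ensemble)
    return np.sign(votes)

# Toy 1-D example: threshold stumps at a few hypothetical split points.
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1, 1, 1, -1, -1, -1])
stumps = [lambda X, t=t: np.where(X < t, 1, -1) for t in (1.5, 2.5, 4.5)]

ensemble = adaboost(X, y, stumps, T=3)
pred = predict(ensemble, X)
```

Note how `alpha` implements the "weighted vote" from the final-hypothesis question: a stump with low weighted error receives a large `alpha` and therefore dominates the vote, while a stump barely better than chance contributes almost nothing.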