Quiz 1 for 3-hour students: Attempt review (Learn@Illinois)

School: University of Illinois Urbana-Champaign
Course: CS 440
Subject: Computer Science
Date: Jan 9, 2024
Type: htm
Pages: 7
Uploaded by: cheng-han

CS 440 R3/R4 FA23: Artificial Intelligence (Fleck, M)
Started on: Friday, September 8, 2023, 9:26 AM
State: Finished

Information

This is a closed-book quiz. No books, no notes, no talking to friends, no looking around on the internet. This quiz must be taken in our lecture hall. Please close all other windows and tabs, so that this quiz window is the only thing on your screen. Cell phones must be stored out of sight. It's OK to use scratch paper, as long as it starts off blank. All answers must be typed into the answer boxes. You cannot upload scans of handwritten work or cut and paste from another window.

You have 20 minutes to finish. If you are accidentally disconnected or need to restart your browser because it freezes, try to restart the quiz as soon as possible. Most (perhaps all) of your attempt should have been saved, and Moodle should give you the rest of your time.

For open-answer questions, a very short answer (e.g. "no") is not sufficient. Back that up with a brief explanation or justification.

Question 1 (Not answered, Not graded)
Question text: Please use this box to report technical problems, possible bugs with questions, etc. If you do not have problems to report, do not write anything in this box. It creates extra work for us if you write answers like "none" into the box.
Answer text: (blank)

Question 2 (Correct, 2.00 points out of 2.00)
Question text: Which of the following people is a famous researcher in speech understanding?
Select one:
a. Fred Jelinek
b. Warren McCulloch
c. Noam Chomsky
d. Roger Schank
Feedback: Your answer is correct. The correct answer is: Fred Jelinek

Question 3 (Correct, 2.00 points out of 2.00)
Question text: Which of the following is incorrect (as a general formula)?
Select one:
a. $$P(A,B,C) = P(A) \cdot P(B | A) \cdot P(C | B)$$
b. $$P(A,B,C) = P(C) \cdot P(B | C) \cdot P(A | B,C)$$
c. $$P(A) \ / \ P(A | C) = P(C) \ / \ P(C | A)$$
Feedback: Your answer is correct. The correct answer is: $$P(A,B,C) = P(A) \cdot P(B | A) \cdot P(C | B)$$

Question 4 (Correct, 2.00 points out of 2.00)
Question text: Why would we add bigram features when implementing Naive Bayes?
Select one:
a. to improve computation speed
b. to prevent overfitting (i.e. building a model that's too specific to the training data)
c. to improve accuracy
d. to avoid underflow
Feedback: Your answer is correct.
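To see concretely why option (a) in Question 3 is not a general identity while (b) and (c) are, here is a small sketch. The joint distribution below is made up for illustration; any generic joint over three binary variables would behave the same way.

```python
# A hand-picked joint distribution over three binary variables A, B, C
# (the numbers are hypothetical; they just need to sum to 1).
joint = {
    (0, 0, 0): 0.30, (0, 0, 1): 0.05, (0, 1, 0): 0.10, (0, 1, 1): 0.05,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20, (1, 1, 0): 0.05, (1, 1, 1): 0.20,
}

def marginal(**fixed):
    """P(vars = values) for any subset of the variables a, b, c."""
    idx = {"a": 0, "b": 1, "c": 2}
    return sum(p for abc, p in joint.items()
               if all(abc[idx[k]] == v for k, v in fixed.items()))

a = b = c = 1
true_p = joint[(a, b, c)]                      # P(A=1, B=1, C=1) = 0.20

# Formula (b) is the chain rule P(C) * P(B|C) * P(A|B,C); it holds exactly.
chain = (marginal(c=c)
         * marginal(b=b, c=c) / marginal(c=c)
         * true_p / marginal(b=b, c=c))
assert abs(chain - true_p) < 1e-12

# Formula (a), P(A) * P(B|A) * P(C|B), silently assumes C is independent
# of A given B, so it fails on this joint (0.2 vs about 0.156).
claimed = (marginal(a=a)
           * marginal(a=a, b=b) / marginal(a=a)
           * marginal(b=b, c=c) / marginal(b=b))
assert abs(claimed - true_p) > 0.01

# Formula (c) is a rearrangement of Bayes' rule: both sides equal
# P(A) * P(C) / P(A, C), so it always holds.
lhs = marginal(a=a) / (marginal(a=a, c=c) / marginal(c=c))
rhs = marginal(c=c) / (marginal(a=a, c=c) / marginal(a=a))
assert abs(lhs - rhs) < 1e-12
```

The failing formula (a) is itself a (first-order) Markov assumption, which is why it can only be "approximately true" rather than a general identity.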
The correct answer is: to improve accuracy

Question 5 (Correct, 2.00 points out of 2.00)
Question text: Naive Bayes makes a statistical assumption about how evidence (e.g. words) is related to class labels. By "assumption," I mean an empirical characterization of the relationship that may be only approximately true. Which of these best categorizes it?
Select one:
a. Uniform prior
b. Bayes Theorem
c. a Markov assumption
d. Independence
e. Conditional independence
Feedback: Your answer is correct. The correct answer is: Conditional independence

Question 6 (Correct, 2.00 points out of 2.00)
Question text: Patrick Pantel and Dekang Lin are known for developing ...
a. the polytree algorithm
b. an early but effective Bag of Words spam classifier
c. a tokenization algorithm
d. a stemming algorithm
e. a simpler derivation of Bayes rule
Feedback: Your answer is correct. The correct answer is: an early but effective Bag of Words spam classifier

Question 7 (Complete, 4.00 points out of 4.00)
Question text: In MAP estimation, our main goal is to find the cause that maximizes $$P(\text{cause} \mid \text{evidence})$$. However, the quantity we actually calculate is $$P(\text{evidence} \mid \text{cause}) \cdot P(\text{cause})$$. These two quantities aren't equal. Why is this ok?
Answer text:
P(cause | evidence) = P(evidence | cause) * P(cause) / P(evidence)
So P(cause | evidence) is proportional to P(evidence | cause) * P(cause).
Since we only care about the relative ordering of causes in MAP estimation, we don't need to worry about the value of P(evidence).
Feedback: The difference between $$P(\text{cause} \mid \text{evidence})$$ and $$P(\text{evidence} \mid \text{cause}) \cdot P(\text{cause})$$ is the factor $$P(\text{evidence})$$, which is the same for all the different causes we are choosing between.
Comments: (none)

Question 8 (Complete, 3.00 points out of 4.00)
Question text: Suppose that we have a collection of words and wish to estimate the underlying probability of each word, as well as a probability \(\alpha\) for any unseen word (UNK). We could compute this using the following equations, where count(W) is the number of times we've seen word W and n is the total number of words in the collection.
P(UNK | C) = \( \frac{\alpha}{n} \)
P(W | C) = \( \frac{\text{count}(W) + \alpha}{n} \)
Explain what's wrong with these equations and show what the correct equations are.
Answer text: By adding \(\alpha\) to each word's count, the total probability of all words will exceed 1. We can change n to \(n + \alpha(V+1)\) so that the probabilities of all the words sum to 1, i.e.:
P(UNK | C) = \( \frac{\alpha}{n + \alpha(V+1)} \)
P(W | C) = \( \frac{\text{count}(W) + \alpha}{n + \alpha(V+1)} \)
Feedback: These probabilities don't add up to 1. The correct equations are as follows, where V is the number of word types seen in the training data:
P(UNK | C) = \( \frac{\alpha}{n + \alpha(V+1)} \)
P(W | C) = \( \frac{\text{count}(W) + \alpha}{n + \alpha(V+1)} \)
Comments: does not define V
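The corrected smoothing equations from Question 8 can be checked numerically. Below is a minimal sketch; the word collection and the value of \(\alpha\) are made up for illustration.

```python
from collections import Counter

# Hypothetical data: a tiny word collection and a smoothing constant.
words = ["the", "cat", "sat", "on", "the", "mat", "the"]
alpha = 0.5

counts = Counter(words)
n = len(words)             # total word tokens, n = 7
V = len(counts)            # word types seen, V = 5
denom = n + alpha * (V + 1)

# Corrected equations: shared denominator n + alpha*(V+1).
p_unk = alpha / denom
p = {w: (c + alpha) / denom for w, c in counts.items()}

# The smoothed probabilities over the V seen types plus UNK sum to 1.
total = p_unk + sum(p.values())
assert abs(total - 1.0) < 1e-12

# With the original (incorrect) denominator n, the total mass is
# 1 + alpha*(V+1)/n, which exceeds 1.
bad_total = alpha / n + sum((c + alpha) / n for c in counts.values())
assert bad_total > 1.0
```

This is exactly Laplace (add-\(\alpha\)) smoothing: the extra \(\alpha(V+1)\) in the denominator pays for the \(\alpha\) added to each of the V seen types plus the single UNK type.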