IDS 575 PS3
School: University of Illinois Chicago
Course: IDS 575
Subject: Statistics
Date: Apr 3, 2024
Graded Problem Set (PS) #03
Student: Ayushi Rajive Srivastava
Total Points: 100 / 100

Question 1 (Maximum Likelihood Basic): 25 / 25
  1.1: 5 / 5   1.2: 8 / 8   1.3: 7 / 7   1.4: 5 / 5
Question 2 (Maximum Likelihood Estimation): 30 / 30
  2.1: 10 / 10   2.2: 10 / 10   2.3: 10 / 10
Question 3 (Logistic Regression): 45 / 45
  3.1: 7 / 7   3.2: 7 / 7   3.3: 5 / 5   3.4: 6 / 6   3.5: 10 / 10   3.6: 10 / 10
Q1 Maximum Likelihood Basic (25 Points)

Q1.1 (5 Points) A coin is tossed 100 times and lands heads 82 times. Choose every correct option in the following. (Select all that apply; no partial credit for selecting more or fewer.)

- This is a fair coin.
- Probability of the overall observation is 0.5^82.
- Maximum possible likelihood of the overall observation is 82/100.
- Minimum possible likelihood of the overall observation is 18/100.
- None of the above.

Q1.2 (8 Points) In the setting given in Q1.1, what is the probability of heads that makes your overall observation most likely? (Autograded short answer: only the final number, rounded to 2 decimal places, e.g., 0.55.)

Answer: 0.82
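The Q1.2 answer can be sanity-checked numerically: the binomial log-likelihood of observing 82 heads in 100 tosses peaks exactly at p = 82/100. A minimal sketch (not part of the original submission; the grid-search approach is illustrative):

```python
import math

def log_likelihood(p, heads=82, tosses=100):
    """Binomial log-likelihood of observing `heads` heads in `tosses` tosses."""
    return heads * math.log(p) + (tosses - heads) * math.log(1 - p)

# Scan a grid of candidate head probabilities and keep the most likely one.
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=log_likelihood)
print(round(best, 2))  # 0.82 -- the MLE equals heads/tosses
```

The closed-form answer follows from setting the derivative of the log-likelihood to zero, which gives p-hat = heads/tosses.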
Q1.3 (7 Points) Assume that the coin used in Q1.1 turns out to be completely normal, i.e., it should produce roughly equal numbers of heads and tails over many random tosses. Choose every possible concern with using Maximum Likelihood Estimation. (Select all that apply; no partial credit for selecting more or fewer.)

- Does not fully use our observation.
- Does not incorporate our prior knowledge.
- Does fit tightly to the given observation.
- Does fit loosely to the given observation.
- None of the above.

Q1.4 (5 Points) Maximum Likelihood Estimation gives us a distribution p(θ) over the parameters as well as the best parameter θ̂ itself. In other words, MLE provides not only the best parameter but also other parameters with the associated uncertainty of being "non-best".

- True
- False
Q2 Maximum Likelihood Estimation (30 Points)

Consider the following density function: f(x; θ) = θ² x e^(−θx) for x ≥ 0, and f(x; θ) = 0 for x < 0. This is a legal probability density (parametrized by θ) because one can verify that its integral over x ∈ (−∞, ∞) is equal to 1. If necessary, the mean and variance of this distribution can be verified as 2/θ and 2/θ², respectively.
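The three stated facts (integral 1, mean 2/θ, variance 2/θ²) can be verified numerically for any positive θ; the value θ = 1.5 below is an arbitrary choice for the check:

```python
import numpy as np

theta = 1.5  # arbitrary positive parameter for the check
x = np.linspace(0.0, 60.0, 60001)  # the density is 0 for x < 0; tail beyond 60 is negligible
dx = x[1] - x[0]
f = theta**2 * x * np.exp(-theta * x)

# Riemann sums approximate the integrals (endpoints of f are ~0, so this is accurate).
total = f.sum() * dx                      # should be ~1
mean = (x * f).sum() * dx                 # should be ~2/theta
var = ((x - mean) ** 2 * f).sum() * dx    # should be ~2/theta**2

print(round(total, 3), round(mean, 3), round(var, 3))
```

This matches the Gamma(shape = 2, rate = θ) distribution, whose mean and variance are 2/θ and 2/θ².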
Q2.1 (10 Points) Derive the likelihood L(θ; D) of a dataset D = {x^(1), x^(2), ..., x^(m)} that consists of m independent samples from this distribution. (Hint: the likelihood must be a function of the parameter θ.)

Submitted answer: MLE Q2.1.pdf (attachment; no preview available).
Q2.2 (10 Points) Derive the log-likelihood function ℓ(θ; D) of the dataset D from Q2.1. (Hint: the log-likelihood is ℓ(θ; D) = log L(θ; D).)

Submitted answer: MLE Q2.2.pdf (attachment; no preview available).
Q2.3 (10 Points) Find the Maximum Likelihood Estimator θ̂. (Hint: set the derivative equal to zero, and then solve the resulting equation.)

Submitted answer: MLE Q2.3.pdf (attachment; no preview available).
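Since the submitted PDF answers are not previewed here, the standard derivation for this density goes: L(θ; D) = ∏ θ² x^(i) e^(−θ x^(i)) = θ^(2m) (∏ x^(i)) e^(−θ Σ x^(i)); so ℓ(θ; D) = 2m log θ + Σ log x^(i) − θ Σ x^(i); setting dℓ/dθ = 2m/θ − Σ x^(i) = 0 gives θ̂ = 2m / Σ x^(i) = 2 / x̄. A simulation sketch checking this closed form (seed and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 3.0
m = 200_000

# f(x; theta) = theta^2 * x * e^(-theta x) is a Gamma(shape=2, scale=1/theta) density.
samples = rng.gamma(shape=2.0, scale=1.0 / theta_true, size=m)

# Closed-form MLE from setting the score 2m/theta - sum(x) to zero.
theta_hat = 2 * m / samples.sum()   # equivalently 2 / samples.mean()
print(round(theta_hat, 2))  # close to 3.0
```

With a large sample the estimate lands very close to the true θ, as the law of large numbers would suggest.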
Q3 Logistic Regression (45 Points)

Q3.1 (7 Points) Suppose that you have trained a logistic regression classifier, and it outputs a prediction h_θ(x) = 0.2 on a new example x. Choose every correct interpretation in the following. (Select all that apply; no partial credit for selecting more or fewer.)

- Our estimate for P(y = 0 | x; θ) is 0.2.
- Our estimate for P(y = 1 | x; θ) is 0.2.
- Our estimate for P(y = 0 | x; θ) is 0.8.
- Our estimate for P(y = 1 | x; θ) is 0.8.

Q3.2 (7 Points) In which of the following ways can we train a logistic regression model? (Select all correct; no partial credit for selecting more or fewer.)

- Minimize least-squares error
- Maximize likelihood
- Solve the normal equation
- Minimize negative log-likelihood
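The training route named in Q3.2 (minimizing the negative log-likelihood, equivalently maximizing the likelihood) can be sketched with plain gradient descent; the dataset, learning rate, and step count below are illustrative, not from the problem set:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, lr=0.1, steps=5000):
    """Minimize the average negative log-likelihood by gradient descent."""
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        grad = X.T @ (p - y) / len(y)  # gradient of the average NLL
        theta -= lr * grad
    return theta

# Tiny synthetic dataset: label is 1 when the feature is positive.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])  # bias + feature
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = train_logistic(X, y)
preds = sigmoid(X @ theta)
print(preds.round(2))  # low for negative-feature rows, high for positive ones
```

Note there is no closed-form "normal equation" for logistic regression, which is why iterative optimization of the NLL is the standard approach.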
Q3.3 (5 Points) In logistic regression, what do we estimate for each one-unit change in X?

- The change in Y multiplied with Y.
- The change in Y from its mean.
- How much Y changes.
- How much the natural logarithm of the odds (i.e., log [p(y=1) / p(y=0)]) changes.

Q3.4 (6 Points) Choose every option that correctly describes properties of the logistic function. (Select all that apply; no partial credit for selecting more or fewer.)

- It maps a real-valued confidence value into a probability value.
- It is essentially an identity function between [0, 1].
- It always maps zero confidence exactly into the probability 0.5.
- It is a continuous function that is differentiable everywhere.
- Its derivative can be easily evaluated with itself.
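Two of the Q3.4 properties can be checked directly: the logistic function maps zero confidence exactly to 0.5, and its derivative can be evaluated from the function itself via σ'(z) = σ(z)(1 − σ(z)). A minimal check (the test point z = 0.7 is arbitrary):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Zero confidence maps exactly to probability 0.5.
print(sigmoid(0.0))  # 0.5

# Compare a central finite difference against the identity s' = s * (1 - s).
z, h = 0.7, 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
analytic = sigmoid(z) * (1 - sigmoid(z))
print(abs(numeric - analytic) < 1e-8)  # True
```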
Q3.5 (10 Points) Recall the grading problem of predicting pass/fail in the class. Suppose you collect data for a group of students in the class with two input features, X₁ = hours studied and X₂ = undergrad GPA. Your goal is to predict the output Y ∈ {pass, fail}. Suppose that you fit a logistic regression, learning its parameters (θ₀; θ₁; θ₂) = (−6; 0.05; 1). What will be the probability that a student who studies for 40 hours and has a GPA of 3.5 passes the class? (Autograded short answer: only the final number, rounded to 2 decimal places, e.g., 0.11.)

Answer: 0.38

Q3.6 (10 Points) How many hours would the student in Q3.5 need to study in order to have at least a 50% chance of passing the class? (Autograded short answer: an integer.)

Answer: 50
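Both answers follow from plugging the given parameters (θ₀; θ₁; θ₂) = (−6; 0.05; 1) into the logistic model: for Q3.5, z = −6 + 0.05·40 + 1·3.5 = −0.5 and σ(−0.5) ≈ 0.38; for Q3.6, a 50% chance requires z = 0, i.e., hours = (6 − 3.5) / 0.05 = 50. A short reproduction:

```python
import math

theta0, theta1, theta2 = -6.0, 0.05, 1.0  # parameters given in the problem

def p_pass(hours, gpa):
    """Logistic model: P(pass) = sigmoid(theta0 + theta1*hours + theta2*gpa)."""
    z = theta0 + theta1 * hours + theta2 * gpa
    return 1.0 / (1.0 + math.exp(-z))

# Q3.5: 40 hours studied, GPA 3.5 -> z = -0.5
print(round(p_pass(40, 3.5), 2))  # 0.38

# Q3.6: 50% chance needs z = 0, so solve theta1*hours = -(theta0 + theta2*gpa)
hours_needed = -(theta0 + theta2 * 3.5) / theta1
print(round(hours_needed))  # 50
```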