PS#3


School: California Lutheran University

Course: IDS575
Subject: Statistics
Date: Apr 3, 2024
Type: pdf
Pages: 9

Q1 Maximum Likelihood Basic (25 Points)

Q1.1 (5 Points)
A coin is tossed 100 times and lands heads 82 times. Choose every correct option in the following.
- This is a fair coin.
- Probability of the overall observation is $0.5^{82}$.
- Maximum possible likelihood of the overall observation is $\tfrac{82}{100}$.
- Minimum possible likelihood of the overall observation is $\tfrac{18}{100}$.
- None of the above.

Q1.2 (8 Points)
In the setting given at Q1.1, what is the probability of heads that makes your overall observation most likely? (short answer: only the result number, rounded to 2 decimal places, like 0.55)

Answer: 0.82

Q1.3 (7 Points)
Assume that the coin used for Q1.1 turns out to be completely normal, i.e., it is supposed to generate almost equal numbers of heads and tails if randomly tossed many times. Choose every possible concern of using Maximum Likelihood Estimation. (select all that apply)
- Does not fully use our observation.
- Does not incorporate our prior knowledge.
- Does fit tightly to the given observation.
- Does fit loosely to the given observation.
- None of the above.

Q1.4 (5 Points)
Maximum Likelihood Estimation gives us a distribution over the parameters as well as the best parameter itself. In other words, MLE provides not only the best parameter $\hat{\theta}$ but also other parameters $\theta$ with the associated uncertainty $p(\theta)$ of being "non-best".
- True
- False
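As a worked check of the Q1.2 answer (added here for reference, not part of the graded submission), the Bernoulli maximum-likelihood derivation below treats the unknown head probability as $p$ and shows why 0.82 maximizes the likelihood of 82 heads in 100 tosses.

```latex
% Reference check for Q1.2: MLE of the head probability p
% from 82 heads and 18 tails in 100 independent tosses.
\begin{align*}
L(p) &= p^{82}(1-p)^{18} \\
\ell(p) &= \log L(p) = 82\log p + 18\log(1-p) \\
\frac{d\ell}{dp} &= \frac{82}{p} - \frac{18}{1-p} = 0
  \;\Longrightarrow\; 82(1-p) = 18p \\
\hat{p} &= \frac{82}{100} = 0.82
\end{align*}
```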
Q2 Maximum Likelihood Estimation (30 Points)

Consider the following density function: $f(x \mid \theta) = \theta^{2} x e^{-\theta x}$ for $x \ge 0$, and $f(x \mid \theta) = 0$ for $x < 0$. This is a legal probability density (parametrized by $\theta$) because one can verify that its integral over $x \in (-\infty, \infty)$ is equal to 1. If necessary, the mean and variance of this distribution can be verified to be $2/\theta$ and $2/\theta^{2}$, respectively.

Q2.1 (10 Points)
Derive the likelihood of a dataset $D = \{x^{(1)}, x^{(2)}, \ldots, x^{(m)}\}$ that consists of $m$ independent samples from this distribution. (Hint: the likelihood $L(\theta; D)$ must be a function of the parameter $\theta$.)

Answer attached: Q2.1.pdf

Q2.2 (10 Points)
Derive the log-likelihood function of the dataset $D$ from Q2.1. (Hint: the log-likelihood is $\ell(\theta; D) = \log L(\theta; D)$.)
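The graded answers to Q2.1 and Q2.2 are in the attached PDFs; as a reference sketch of the standard derivation for this density, the likelihood of $m$ i.i.d. samples and its logarithm work out as follows.

```latex
% Reference sketch for Q2.1-Q2.2: likelihood and log-likelihood of
% m i.i.d. samples from f(x | theta) = theta^2 x e^{-theta x}, x >= 0.
\begin{align*}
L(\theta; D) &= \prod_{i=1}^{m} f\!\left(x^{(i)} \mid \theta\right)
              = \prod_{i=1}^{m} \theta^{2} x^{(i)} e^{-\theta x^{(i)}}
              = \theta^{2m} \Bigl(\prod_{i=1}^{m} x^{(i)}\Bigr)
                e^{-\theta \sum_{i=1}^{m} x^{(i)}} \\
\ell(\theta; D) &= \log L(\theta; D)
              = 2m \log\theta + \sum_{i=1}^{m} \log x^{(i)}
                - \theta \sum_{i=1}^{m} x^{(i)}
\end{align*}
```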
Answer attached: Q2.2.pdf

Q2.3 (10 Points)
Find the Maximum Likelihood Estimator $\hat{\theta}$. (Hint: set the derivative equal to zero, and then solve the formula.)

Answer attached: Q2.3.pdf
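Continuing the sketch above (the graded answer is in Q2.3.pdf), setting $d\ell/d\theta = 2m/\theta - \sum_i x^{(i)} = 0$ gives $\hat{\theta} = 2m / \sum_i x^{(i)} = 2/\bar{x}$. A minimal NumPy check, using a made-up true value $\theta = 1.5$ purely for illustration, confirms the formula numerically; it relies on the fact that this density is the Gamma(shape = 2, rate = $\theta$) distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 1.5                 # hypothetical true parameter, chosen only for this demo
m = 200_000                      # number of simulated samples

# f(x | theta) = theta^2 * x * exp(-theta * x) is the Gamma(shape=2, rate=theta) density,
# so NumPy's gamma sampler (which takes scale = 1 / rate) can generate the data.
x = rng.gamma(shape=2.0, scale=1.0 / theta_true, size=m)

theta_hat = 2.0 / x.mean()       # closed-form MLE: theta_hat = 2m / sum(x) = 2 / mean(x)
print(f"true theta = {theta_true}, MLE estimate = {theta_hat:.3f}")
# Expect an estimate close to 1.5, up to sampling noise.
```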
Q3 Logistic Regression (45 Points)

Q3.1 (7 Points)
Suppose that you have trained a logistic regression classifier, and it outputs a prediction $h_{\theta}(x) = 0.2$ on a new example $x$. Choose every correct interpretation in the following.
- Our estimate for $P(y = 0 \mid x; \theta)$ is 0.2.
- Our estimate for $P(y = 1 \mid x; \theta)$ is 0.2.
- Our estimate for $P(y = 0 \mid x; \theta)$ is 0.8.
- Our estimate for $P(y = 1 \mid x; \theta)$ is 0.8.

Q3.2 (7 Points)
Which of the following ways can we train a logistic regression model? (select all correct)
- Minimize least-square error
- Maximize likelihood
- Solve normal equation
- Minimize negative log-likelihood

Q3.3 (5 Points)
In logistic regression, what do we estimate for each one-unit change in $X$?
- The change in $Y$ multiplied with $Y$.
- The change in $Y$ from its mean.
- How much $Y$ changes.
- How much the natural logarithm of the odds (i.e., $\log \tfrac{p(y=0)}{p(y=1)}$) changes.
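Two of the options in Q3.2, maximizing the likelihood and minimizing the negative log-likelihood, describe the same standard training procedure. The sketch below (made-up data and function names, not from the course materials) fits a logistic regression by gradient descent on the average negative log-likelihood.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.1, steps=5000):
    """Fit logistic regression by gradient descent on the average negative log-likelihood."""
    Xb = np.c_[np.ones(len(X)), X]      # prepend a column of ones for the intercept
    theta = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ theta)         # predicted P(y=1 | x; theta)
        grad = Xb.T @ (p - y) / len(y)  # gradient of the average negative log-likelihood
        theta -= lr * grad
    return theta

# Toy data: one feature whose larger values make the label 1 more likely.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))
y = (rng.random(500) < sigmoid(2.0 * X[:, 0] - 0.5)).astype(float)

theta = train_logreg(X, y)
print("estimated (intercept, slope):", np.round(theta, 2))  # roughly (-0.5, 2.0)
```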
Q3.4 (6 Points)
Choose every option that correctly describes properties of the logistic function.
- It maps a real-valued confidence value into a probability value.
- It is essentially an identity function between $[0, 1]$.
- It always maps zero confidence exactly into the probability 0.5.
- It is a continuous function that is differentiable everywhere.
- Its derivative can be easily evaluated with itself.

Q3.5 (10 Points)
Recall the grading problem to predict pass/fail in the class. Suppose you collect data for a group of students in the class, consisting of two input features $X_1$ = hours studied and $X_2$ = undergrad GPA. Your goal is to predict the output $Y \in$ {pass, fail}. Suppose that you fit a logistic regression, learning its parameters $(\theta_0; \theta_1; \theta_2) = (-6; 0.05; 1)$. What will be the probability for a student who studies for 40 hours and has a GPA of 3.5 to pass the class? (short answer: only the result number, rounded to 2 decimal places, like 0.11)

Answer: 0.38

Q3.6 (10 Points)
How many hours would the student in part Q3.5 need to study in order to have at least a 50% chance of passing the class? (short answer: only the result number)

Answer: 50
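A quick numeric check of the Q3.5 and Q3.6 answers (added for reference, not part of the graded submission), plugging the given parameters into the logistic function:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

theta0, theta1, theta2 = -6.0, 0.05, 1.0     # parameters given in Q3.5

# Q3.5: probability of passing with 40 hours studied and a 3.5 GPA.
p_pass = sigmoid(theta0 + theta1 * 40 + theta2 * 3.5)
print(round(p_pass, 2))                      # 0.38

# Q3.6: hours needed for at least a 50% chance, i.e. theta0 + theta1*h + theta2*3.5 = 0.
hours = -(theta0 + theta2 * 3.5) / theta1
print(hours)                                 # 50.0
```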
GRADED: Problem Set (PS) #03
STUDENT: Urvashiben Patel
TOTAL POINTS: 95 / 100 pts

QUESTION 1 (Maximum Likelihood Basic): 20 / 25 pts
  1.1: 0 / 5 pts
  1.2: 8 / 8 pts
  1.3: 7 / 7 pts
  1.4: 5 / 5 pts

QUESTION 2 (Maximum Likelihood Estimation): 30 / 30 pts
  2.1: 10 / 10 pts
  2.2: 10 / 10 pts
  2.3: 10 / 10 pts

QUESTION 3 (Logistic Regression): 45 / 45 pts
  3.1: 7 / 7 pts
  3.2: 7 / 7 pts
  3.3: 5 / 5 pts
  3.4: 6 / 6 pts
  3.5: 10 / 10 pts
  3.6: 10 / 10 pts