1. What is your guess on the value of p?
2. In Maximum Likelihood Estimation, we want to find a parameter p which maximizes the probability of all the observations in the dataset. If the dataset is a matrix A, where each row a₁, a₂, …, aₘ is an individual observation, we want to maximize P(A) = P(a₁)P(a₂)⋯P(aₘ), because the individual experiments are independent. Maximizing this is equivalent to maximizing log P(A) = log P(a₁) + log P(a₂) + ⋯ + log P(aₘ), which in turn is equivalent to minimizing −log P(A) = −log P(a₁) − log P(a₂) − ⋯ − log P(aₘ).
3. Here you need to find out P(aᵢ) for yourself.
4. If you can do that properly, you will find an equation of the form:
   −log P(A) / (mn) = −(Σᵢ yᵢ / (mn)) log p − (1 − Σᵢ yᵢ / (mn)) log(1 − p).
   Now, define q = Σᵢ yᵢ / (mn). Then the equation becomes:
   −log P(A) / (mn) = −q log p − (1 − q) log(1 − p).
   Use Pinsker's inequality or calculus to show that p = q.
5. What is the value of p for the above dataset given in the table?
6. If you toss 20 coins now, how many coins are most likely to yield a head?
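For step 4, the calculus route is a one-line derivation, sketched here assuming the per-flip model is Bernoulli(p) and q is the empirical fraction of heads as defined above:

```latex
\frac{d}{dp}\Bigl[-q\log p - (1-q)\log(1-p)\Bigr]
  = -\frac{q}{p} + \frac{1-q}{1-p} = 0
\;\Longrightarrow\; q(1-p) = p(1-q)
\;\Longrightarrow\; p = q.
```

The information-theoretic route reaches the same conclusion: −q log p − (1 − q) log(1 − p) = H(q) + KL(q ‖ p), and the KL divergence is nonnegative with equality exactly when p = q.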
Please give a step-by-step solution to parts 1, 2, and 3.
This problem is on Maximum Likelihood Estimation
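A quick numerical sanity check of the p = q result in step 4 can be run on a small made-up dataset (the matrix below is hypothetical, standing in for the table referenced in the problem, with m = 3 observations of n = 4 flips each):

```python
import math

# Hypothetical 3x4 dataset of coin flips: 1 = head, 0 = tail.
A = [
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
]

m, n = len(A), len(A[0])
# q = (sum of all y_i) / (mn): the empirical fraction of heads.
q = sum(sum(row) for row in A) / (m * n)

def neg_log_likelihood(p):
    """Average negative log-likelihood: -q*log(p) - (1-q)*log(1-p)."""
    return -q * math.log(p) - (1 - q) * math.log(1 - p)

# Grid search over (0, 1); the minimizer should land on (or next to) q.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = min(grid, key=neg_log_likelihood)
print(f"q = {q:.4f}, grid-search MLE p_hat = {p_hat:.4f}")
```

With 7 heads out of 12 flips, q = 7/12 ≈ 0.583, and the grid search recovers the same value, matching the closed-form answer p = q.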