
Question

Please give a step-by-step solution to parts 1, 2, and 3.

This problem is on Maximum Likelihood Estimation

[Table: each row records the outcomes (H or T) of a coin's 1st through 7th tosses. The transcription is garbled and the row boundaries cannot be recovered reliably; the transcribed outcomes, in order, read: T H H H H HT H T H H T T H H T H T H T HH T T TT H HT T T H H T T HT T T T T T H T T T T H H T T]
1. What is your guess for the value of p?
2. In Maximum Likelihood Estimation, we want to find a parameter p that maximizes the probability of all the observations in the dataset. If the dataset is a matrix A whose rows a_1, a_2, ..., a_m are the individual observations, we want to maximize P(A) = P(a_1) P(a_2) ... P(a_m), because the individual experiments are independent. Maximizing this is equivalent to maximizing log P(A) = log P(a_1) + log P(a_2) + ... + log P(a_m), which in turn is equivalent to minimizing -log P(A) = -log P(a_1) - log P(a_2) - ... - log P(a_m).
3. Here you need to work out P(a_i) for yourself.
4. If you do that properly, you will find an equation of the form
$$-\frac{\log P(A)}{mn} = -\frac{\sum_{i=1}^{m} y_i}{mn}\log p - \frac{\sum_{i=1}^{m}(n - y_i)}{mn}\log(1-p),$$
where y_i is the number of heads in the i-th row and n is the number of tosses per row. Now define
$$q = \frac{\sum_{i=1}^{m} y_i}{mn}.$$
Then the equation becomes
$$-\frac{\log P(A)}{mn} = -q\log p - (1-q)\log(1-p).$$
Use Pinsker's Inequality or calculus to show that p = q.
5. What is the value of p for the dataset given in the table above?
6. If you toss 20 coins now, how many coins are most likely to yield a head?
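For step 4, the calculus route can be sketched as follows (a sketch only; the Pinsker-based route is omitted): differentiate the normalized negative log-likelihood with respect to p and set the derivative to zero.

```latex
\frac{d}{dp}\left[-q\log p - (1-q)\log(1-p)\right]
  = -\frac{q}{p} + \frac{1-q}{1-p} = 0
\;\Longrightarrow\; q(1-p) = (1-q)p
\;\Longrightarrow\; p = q.
% Second derivative: q/p^2 + (1-q)/(1-p)^2 > 0 on (0,1),
% so the objective is convex and p = q is the minimizer.
```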
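The estimate p = q from steps 4 and 5 can be checked numerically. A minimal sketch follows; note the toss table here is a made-up stand-in, since the transcription of the original table is too garbled to recover exactly:

```python
import math

# Hypothetical stand-in data: m = 4 experiments of n = 7 tosses each
# (NOT the original table, whose transcription is unrecoverable).
tosses = [
    "THHHHHT",
    "HTHHTTH",
    "HTHTHTH",
    "HTTTTHH",
]
m, n = len(tosses), len(tosses[0])

# q = (total number of heads) / (mn), the candidate MLE from step 4.
# For this stand-in data q = 16/28, i.e. roughly 0.571.
q = sum(row.count("H") for row in tosses) / (m * n)

def avg_nll(p):
    """Normalized negative log-likelihood -log P(A) / (mn) from step 4."""
    return -q * math.log(p) - (1 - q) * math.log(1 - p)

# Grid search over (0, 1): the minimizer should be the grid point nearest q.
grid = [k / 1000 for k in range(1, 1000)]
p_best = min(grid, key=avg_nll)
print(q, p_best)

# Step 6 analogue: over 20 tosses, the expected number of heads is 20 * q.
print(20 * q)
```

The grid search is only a sanity check on the calculus result; with the derivative argument in hand, p = q needs no search.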