
Question
Answer choices:

○ θ̂_MLE = Σ_{i=1}^n xᵢ

○ θ̂_MLE = ∏_{i=1}^n xᵢ!

○ none of the other answers is correct

○ θ̂_MLE = (1/n) · Σ_{i=1}^n xᵢ

○ θ̂_MLE = -nθ
A discrete random variable X with the range (i.e. the set of all possible values)
R_X = {0, 1, 2, 3, …} has probability mass function

    f(x; θ) = θ^x · e^{-θ} / x!,    x = 0, 1, 2, 3, …,

where θ is an unknown parameter. We want to find the maximum likelihood
estimate θ̂_MLE of the parameter θ based on a sample (x₁, x₂, …, xₙ).
To do that, we find the likelihood function
    L(θ) = L(θ; x₁, …, xₙ) = ∏_{i=1}^n f(xᵢ; θ)
         = ∏_{i=1}^n θ^{xᵢ} e^{-θ} / xᵢ!
         = (θ^{x₁} e^{-θ} / x₁!) · (θ^{x₂} e^{-θ} / x₂!) ⋯ (θ^{xₙ} e^{-θ} / xₙ!)
         = e^{-nθ} · θ^{Σ_{i=1}^n xᵢ} · ∏_{i=1}^n (1 / xᵢ!).
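This factorization can be checked numerically; below is a minimal sketch, using an illustrative sample and θ value that are assumptions for demonstration, not part of the original question:

```python
import math

theta = 2.5                 # illustrative parameter value (assumption)
xs = [1, 3, 0, 2, 4]        # illustrative sample (assumption)
n = len(xs)

# Poisson pmf: f(x; theta) = theta^x * e^{-theta} / x!
def pmf(x, theta):
    return theta**x * math.exp(-theta) / math.factorial(x)

# Likelihood written as a literal product of pmf factors
product_form = math.prod(pmf(x, theta) for x in xs)

# Simplified form: e^{-n*theta} * theta^{sum x_i} * prod(1 / x_i!)
closed_form = (math.exp(-n * theta)
               * theta**sum(xs)
               * math.prod(1 / math.factorial(x) for x in xs))

assert math.isclose(product_form, closed_form, rel_tol=1e-12)
```

The two expressions agree up to floating-point rounding, confirming the algebra above.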
Then, the log-likelihood function is
    ln L(θ) = ln L(θ; x₁, …, xₙ)
            = ln( e^{-nθ} · θ^{Σ_{i=1}^n xᵢ} · ∏_{i=1}^n (1 / xᵢ!) )
            = ln e^{-nθ} + ln θ^{Σ xᵢ} + ln( ∏_{i=1}^n (1 / xᵢ!) )
            = -nθ + (Σ_{i=1}^n xᵢ) · ln θ + ln( ∏_{i=1}^n (1 / xᵢ!) ).
To find θ̂_MLE (the maximum likelihood estimator of θ), we maximize ln L(θ) while thinking
of it as a function of θ only, i.e. treating the xᵢ's as given constants, and thus treating the last
term ln( ∏_{i=1}^n (1 / xᵢ!) ) as a constant.
Eventually, what do we get as the maximum likelihood estimator θ̂_MLE of the parameter θ?
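One way to see where the maximum sits, without carrying out the formal derivation, is to scan ln L(θ) over a grid of θ values. The sketch below uses an illustrative sample (an assumption, not data from the question) and drops the constant term ln ∏(1/xᵢ!), which does not affect where the maximum occurs:

```python
import math

xs = [1, 3, 0, 2, 4]        # illustrative sample (assumption)
n = len(xs)
s = sum(xs)

# Log-likelihood up to the additive constant ln(prod 1/x_i!)
def log_lik(theta):
    return -n * theta + s * math.log(theta)

# Scan a grid of positive theta values and pick the maximizer
grid = [k / 1000 for k in range(1, 10001)]   # theta in 0.001 .. 10.0
best = max(grid, key=log_lik)

# The grid maximizer lands (to grid precision) at the sample mean
assert abs(best - s / n) < 1e-3
```

The scan points at the sample mean, consistent with setting the derivative d/dθ ln L(θ) = -n + (Σ xᵢ)/θ equal to zero.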