
Question: Exercises 4 and 5
To estimate the expected time to failure one could just operate $N$ devices until
all of them fail, and then estimate the expected value with the sample mean of
the failure times. Sounds elementary. But there is a problem: it could be a very
long time until all $N$ devices fail, and it's impractical to run a test for that long.
In practice, life testing is often terminated (truncated) at some fixed time $T_E$.
When the test ends, $n$ devices have failed with failure times $t_1, t_2, t_3, \ldots, t_n$,
and $N - n$ devices are still operating successfully.
How does one use this information to estimate the expected time to failure?
Recall the following result:

Theorem 1. If the time between independent events is exponentially distributed
with common parameter $\lambda$, then the number of events in an interval of fixed
length $T$ is Poisson with parameter $\mu = \lambda T$.

Where we also recall:
Definition 1. A Poisson random variable $N$ has probability mass function:

$$P(N = n) = \frac{e^{-\mu}\mu^n}{n!}; \quad n = 0, 1, 2, \ldots \qquad (7)$$

where $\mu > 0$ is a constant.
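As a quick sanity check of Definition 1 (with illustrative, assumed values of $\lambda$ and $T$), the pmf in Equation (7) can be evaluated directly and summed:

```python
from math import exp, factorial

def poisson_pmf(n, mu):
    """P(N = n) for a Poisson random variable with mean mu (Equation (7))."""
    return exp(-mu) * mu**n / factorial(n)

# With mu = lam * T (Theorem 1): distribution of the number of failures in [0, T].
lam, T = 0.5, 4.0          # hypothetical rate and interval length
mu = lam * T
mass = sum(poisson_pmf(n, mu) for n in range(50))
print(mass)                # the probabilities sum to 1 (up to truncation of the tail)
```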
Now in our truncated testing scenario, for each of the $n$ failed devices we had one
failure in the interval $[0, t_i]$, and for each of the $N - n$ surviving devices 0 failures
in the interval $[0, T_E]$. Using Equation (7) and the independence of the failure
times, the probability of observing the data we have is:

$$L(\lambda) = \left[\prod_{i=1}^{n} \lambda t_i e^{-\lambda t_i}\right]\left(e^{-\lambda T_E}\right)^{N-n} \qquad (8)$$

$$= \lambda^n \left(\prod_{i=1}^{n} t_i\right) e^{-\lambda \sum_{i=1}^{n} t_i}\, e^{-\lambda (N-n) T_E} \qquad (9)$$

$$= \lambda^n \left(\prod_{i=1}^{n} t_i\right) e^{-\lambda \left(\sum_{i=1}^{n} t_i + (N-n) T_E\right)} \qquad (10)$$
Reasoning that a good estimate of $\lambda$ is one for which the probability, $L(\lambda)$, is
as large as possible, we look for the maximum likelihood estimator of $\lambda$.
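As a numerical sketch (with made-up failure data and test parameters), one can maximize $\log L(\lambda)$ on a grid and see that the maximizer agrees with the closed form $n / (\sum_{i=1}^{n} t_i + (N-n)T_E)$ asked for in Exercise 4 below:

```python
# Numerical sketch with assumed data: maximize log L(lambda) from the
# truncated-test likelihood (Equation (10)) on a grid.
from math import log

t = [1.2, 0.7, 2.5, 1.9]      # hypothetical failure times of the n failed devices
N, T_E = 10, 3.0              # devices on test and truncation time (assumed)
n = len(t)

def log_L(lam):
    # log of Equation (10), dropping the constant term log(prod t_i)
    return n * log(lam) - lam * (sum(t) + (N - n) * T_E)

lam_hat = n / (sum(t) + (N - n) * T_E)
best = max((i / 1000 for i in range(1, 2001)), key=log_L)
print(lam_hat, best)          # the grid maximizer agrees to grid resolution
```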
Exercise 4. Show that the maximum likelihood estimator of $\lambda$ is

$$\hat{\lambda} = \frac{n}{\sum_{i=1}^{n} t_i + (N-n) T_E} \qquad (11)$$

Possibly helpful suggestion: $L(\lambda)$ is maximal when $\log(L(\lambda))$ is maximal.
Then the maximum likelihood estimator for mean time between failures is:

$$\hat{\mu} = \frac{\sum_{i=1}^{n} t_i + (N-n) T_E}{n} \qquad (12)$$
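A small simulation (all parameters assumed for illustration) shows the estimator of Equation (12) recovering the true mean time to failure from a truncated test:

```python
# Simulated truncated life test: exponential failure times, test stopped at T_E.
import random

random.seed(0)
true_mu = 5.0                   # true mean time to failure (assumed)
N, T_E = 10_000, 4.0            # devices on test, truncation time (assumed)

times = [random.expovariate(1 / true_mu) for _ in range(N)]
failed = [t for t in times if t <= T_E]     # observed failure times
n = len(failed)

# Equation (12): sum of observed failure times plus (N - n) censored intervals.
mu_hat = (sum(failed) + (N - n) * T_E) / n
print(mu_hat)                   # close to true_mu = 5.0
```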
Here's another, possibly more intuitive, way to deduce the estimator of mean
time to failure given by Equation (12). If we knew all the failure times, the
natural estimate of mean time to failure would be:

$$\hat{\mu} = \frac{\sum_{i=1}^{N} t_i}{N} \qquad (13)$$
But we don't have the $t_i$ for $i > n$. But why not replace them with an estimate
of their time to failure? By the forgetfulness property, $E[T \mid T > T_E] = T_E + \mu$.
We don't know $\mu$, but we can replace the unobserved $t_i$'s in Equation (13) with $T_E + \hat{\mu}$;
doing so one obtains:

$$\hat{\mu} = \frac{\sum_{i=1}^{n} t_i + (N-n)(T_E + \hat{\mu})}{N} \qquad (14)$$
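The forgetfulness property invoked above can be checked by Monte Carlo (with an assumed $\mu$ and $T_E$): among exponential lifetimes that survive past $T_E$, the mean lifetime is $T_E + \mu$.

```python
# Monte Carlo check of E[T | T > T_E] = T_E + mu for exponential T.
import random

random.seed(1)
mu, T_E = 2.0, 1.5              # assumed mean lifetime and truncation time
samples = [random.expovariate(1 / mu) for _ in range(200_000)]
survivors = [t for t in samples if t > T_E]
cond_mean = sum(survivors) / len(survivors)
print(cond_mean)                # approximately T_E + mu = 3.5
```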
Exercise 5. Solve Equation (14) for $\hat{\mu}$, and show the result is the estimator
given by Equation (12).
Part 3

Every major topic in this course could be a course, or several courses, in itself.
For example, in our discussion of random walks we