Let X and Y have a bivariate …
Compute:
(a) …
(b) …
Chapter 4 Solutions, Probability and Statistical Inference (9th Edition)
- Suppose X and Y are two independent random variables, each with variance 1. Let Z = X + bY, where b > 0. If Corr(Z, Y) = 1/2, what is the value of b? (A quick numerical check follows this group.)
- Let the joint pmf of X and Y be defined by f(x, y) = (x + y)/32, where x = 1, 2 and y = 1, 2, 3, 4.
- Let y_t be a normal random variable with constant mean E(y_t) = 4, constant variance Var(y_t) = σ², and covariance Cov(y_t, y_j) = 0 for t ≠ j. Consider the sample mean ȳ = (1/T) Σ y_t. (a) Show that the variance of the sample mean ȳ is σ²/T.
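For the first question in this group: since X and Y are independent with unit variances, Cov(Z, Y) = b and Var(Z) = 1 + b², so Corr(Z, Y) = b/√(1 + b²) = 1/2 gives b = 1/√3. The sketch below verifies this numerically; the normal draws are only a simulation assumption, since the question does not specify the distributions.

```python
# Check of Corr(Z, Y) = 1/2 with Z = X + b*Y, X and Y independent, each with variance 1.
# Corr(Z, Y) = Cov(X + bY, Y) / (sd(Z) * sd(Y)) = b / sqrt(1 + b^2) = 1/2  =>  b = 1/sqrt(3).
import numpy as np

b = 1 / np.sqrt(3)
rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)      # Var(X) = 1 (normality assumed only for the simulation)
y = rng.normal(size=1_000_000)      # Var(Y) = 1, independent of X
z = x + b * y
print(np.corrcoef(z, y)[0, 1])      # ~0.5
```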
- Q1. Let X̄i and Si² be the mean and variance of independent random samples of size ni from populations with mean μi and variance σi² (i = 1, 2), respectively, where μ1 = 20, σ1² = 9, n1 = 36 and μ2 = 18, σ2² = 16, n2 = 49. Find (a) P(X̄1 > 20.98); (b) x̄ such that P(X̄1 < x̄) = …; (c) P(S1² … 12.8062); (d) s² such that P(S1² > s²) = 0.80; (e) P(X̄1 − X̄2 > 0.2336); (f) P(S1²/S2² > 0.8368). (A sketch for parts (a) and (e) follows this group.)
- Let E(X | Y = y) = 3y and Var(X | Y = y) = 2, and let Y have the p.d.f. f(y) = e^(−y) for y > 0 and 0 otherwise. Then the variance of X is …
- Let X1, …, Xn be a random sample from a Uniform distribution on the interval [0, 2θ + 1]. (a) Find the MLE of θ. (b) Find the MLE for the variance of the distribution of Xi.
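For parts (a) and (e) of the two-sample question above, the sample means can be treated as (approximately) normal: X̄1 has mean 20 and standard error √(9/36) = 0.5, and X̄1 − X̄2 has mean 2 and standard error √(9/36 + 16/49). A minimal sketch under that normal approximation:

```python
# Parts (a) and (e), assuming X1bar ~ N(20, 9/36) and X2bar ~ N(18, 16/49) (normal approximation).
from math import sqrt
from scipy.stats import norm

# (a) P(X1bar > 20.98): standard error = sqrt(9/36) = 0.5, z = 1.96
print(norm.sf(20.98, loc=20, scale=sqrt(9 / 36)))                    # ~0.025

# (e) P(X1bar - X2bar > 0.2336): difference has mean 2, sd = sqrt(9/36 + 16/49)
print(norm.sf(0.2336, loc=20 - 18, scale=sqrt(9 / 36 + 16 / 49)))    # ~0.99
```

The printed values should be close to P(Z > 1.96) ≈ 0.025 and P(Z > −2.33) ≈ 0.99.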
- Consider an estimator of μ: W = (1/16)Y1 + (1/16)Y2 + (1/4)Y3 + (1/8)Y4 + (1/2)Y5. This is an example of a weighted average of the Yi's. Show that W is also an unbiased estimator of μ, and find the variance of W. (A Monte Carlo check follows this group.)
- Let x1, x2, and x3 be a random sample from an exponential distribution with mean θ. Suppose that θ̂ = (x1 + x2)/2 is an estimator of θ. (1) Determine whether the estimator is biased or unbiased (show your solution). (2) Compute the variance of the given estimator.
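A Monte Carlo check of the weighted-average question above, assuming (as in the standard version of this exercise, though not stated in the excerpt) that the Yi are i.i.d. with mean μ and variance σ². The weights sum to 1, so E[W] = μ, and Var(W) = σ² Σ wi² = (43/128)σ²; the μ and σ below are arbitrary illustrative values, and normality is only a simulation choice.

```python
# Verify unbiasedness and variance of W = (1/16)Y1 + (1/16)Y2 + (1/4)Y3 + (1/8)Y4 + (1/2)Y5,
# assuming the Y_i are i.i.d. with mean mu and variance sigma^2.
import numpy as np

w = np.array([1/16, 1/16, 1/4, 1/8, 1/2])
mu, sigma = 5.0, 2.0                       # arbitrary illustrative values
rng = np.random.default_rng(1)
y = rng.normal(mu, sigma, size=(1_000_000, 5))
W = y @ w

print(w.sum())                             # 1.0  -> E[W] = mu (unbiased)
print(W.mean())                            # ~5.0
print(W.var(), (w**2).sum() * sigma**2)    # ~1.34 vs (43/128) * sigma^2 = 1.34375
```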
- Let {X1, X2, …, Xn} be a random sample from the normal distribution N(μ, σ²). A commonly used unbiased estimator of σ² is the sample variance S² = (1/(n − 1)) Σ (Xi − X̄)². (a) Find the MSE (mean squared error) of S². (b) Another commonly adopted definition of the sample variance is SD² = (1/n) Σ (Xi − X̄)². Find the MSE of SD² as an estimator of σ² and compare it to that of S². Which one is better? (c) Suppose that we seek an estimator of the form σ̂² = cS², where c is some positive constant. Find the value of c such that σ̂² has the smallest MSE among all estimators of the form cS². (A simulation comparing the three estimators follows this group.)
- (Sec. 6.1) Using a long rod that has length μ (unknown), you are going to lay out a square plot in which the length of each side is μ. Thus the area of the plot will be μ². However, because you do not know the value of μ, you decide to make n independent measurements X1, …, Xn of the length. Assume that each Xi has mean μ and variance σ². (a) Show that X̄² is not an unbiased estimator for the area of the square plot μ². [Hint: for any rv Y, E[Y²] = V[Y] + E[Y]². Apply this for Y = X̄.] (b) For what value of k is the estimator X̄² − kS² unbiased for μ²?
- Use what you know about order statistics to show that, for a random sample of size n = 3, the median is an unbiased estimator of the parameter θ of a uniform population with α = θ − 1/2 and β = θ + 1/2.
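For the sample-variance question above, the standard results for normal data are MSE(S²) = 2σ⁴/(n − 1), and the MSE-minimizing choice among estimators cS² is c = (n − 1)/(n + 1), i.e. dividing the sum of squared deviations by n + 1. A small simulation sketch (n, σ, and the number of replications are arbitrary illustrative choices):

```python
# Monte Carlo comparison of the three variance estimators for normal data:
# divide the sum of squared deviations by n-1 (S^2), by n (SD^2), or by n+1 (minimum-MSE cS^2).
import numpy as np

n, sigma, reps = 10, 2.0, 200_000
rng = np.random.default_rng(2)
x = rng.normal(0.0, sigma, size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)   # sum of squared deviations

for label, est in [("S^2  (divide by n-1)", ss / (n - 1)),
                   ("SD^2 (divide by n)  ", ss / n),
                   ("cS^2 (divide by n+1)", ss / (n + 1))]:
    print(label, "MSE ~", ((est - sigma**2) ** 2).mean())
# Expected values: 2*sigma^4/(n-1) ~ 3.56 > (2n-1)*sigma^4/n^2 = 3.04 > 2*sigma^4/(n+1) ~ 2.91
```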