Let X, Y be two Bernoulli random variables and denote by p = P (X = 1), q = P (Y = 1) and r = P (X = 1, Y = 1). Prove that X and Y are independent if and only if r = pq.
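A sketch of the "if" direction may help (the "only if" direction is immediate from the definition of independence). Assuming r = pq, the three remaining joint probabilities also factor:

```latex
P(X=1, Y=0) = p - r = p - pq = p(1-q) = P(X=1)\,P(Y=0),\\
P(X=0, Y=1) = q - r = q - pq = (1-p)\,q = P(X=0)\,P(Y=1),\\
P(X=0, Y=0) = 1 - p - q + r = (1-p)(1-q) = P(X=0)\,P(Y=0).
```

Together with P(X=1, Y=1) = r = pq, all four joint probabilities factor, so X and Y are independent.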
Let {(Xi, Yi)}ⁿᵢ₌₁ be a sample of n i.i.d. copies of (X, Y). Based on this sample, we want to test whether X and Y are independent, i.e., whether r = pq.
Define p̂ = (1/n) Σᵢ Xi, q̂ = (1/n) Σᵢ Yi and r̂ = (1/n) Σᵢ XiYi.
– Prove that these are, respectively, consistent estimators of p, q and r.
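The consistency claim can be checked numerically via the law of large numbers. A minimal simulation sketch (the generator `sample_pair` and helper names below are illustrative, not from the original; it assumes parameters satisfying r ≤ min(p, q) and p + q − r ≤ 1 so the joint law is well defined):

```python
import random

def sample_pair(p, q, r):
    """Draw one (X, Y) with P(X=1)=p, P(Y=1)=q, P(X=1,Y=1)=r."""
    u = random.random()
    if u < r:                 # P(X=1, Y=1) = r
        return 1, 1
    elif u < p:               # P(X=1, Y=0) = p - r
        return 1, 0
    elif u < p + q - r:       # P(X=0, Y=1) = q - r
        return 0, 1
    else:                     # P(X=0, Y=0) = 1 - p - q + r
        return 0, 0

def estimates(p, q, r, n, seed=0):
    """Return (p_hat, q_hat, r_hat) computed from n simulated pairs."""
    random.seed(seed)
    xs, ys = zip(*(sample_pair(p, q, r) for _ in range(n)))
    p_hat = sum(xs) / n
    q_hat = sum(ys) / n
    r_hat = sum(x * y for x, y in zip(xs, ys)) / n
    return p_hat, q_hat, r_hat

# For large n the estimates land close to the true (p, q, r):
print(estimates(0.6, 0.3, 0.25, n=100_000))
```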
– Show that the vector √n ((p̂, q̂, r̂) − (p, q, r)) is asymptotically normal as n → ∞, and determine its asymptotic covariance matrix.
– Using the previous question combined with the Delta-method, prove that
√n ((r̂ − p̂q̂) − (r − pq)) → N(0, V)
as n → ∞ in distribution, where V depends on p, q and r.
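One way to see where V comes from (a sketch, assuming Σ denotes the asymptotic covariance matrix of √n ((p̂, q̂, r̂) − (p, q, r)) from the previous question): apply the Delta method to g(p, q, r) = r − pq,

```latex
\nabla g(p,q,r) = (-q,\ -p,\ 1)^\top, \qquad
V = \nabla g(p,q,r)^\top \,\Sigma\, \nabla g(p,q,r),
```

where the entries of Σ are Var(X) = p(1−p), Var(Y) = q(1−q), Var(XY) = r(1−r), Cov(X, Y) = r − pq, Cov(X, XY) = r(1−p) and Cov(Y, XY) = r(1−q) (using X² = X and Y² = Y for Bernoulli variables). Expanding the quadratic form gives

```latex
V = q^2 p(1-p) + p^2 q(1-q) + r(1-r) + 2pq(r-pq) - 2qr(1-p) - 2pr(1-q),
```

which indeed depends only on p, q and r.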
– Consider the following hypotheses:
H0: “X and Y are independent” vs H1: “X and Y are not independent”
Assuming that H0 is true, show that V = pq (1 − p) (1 − q) and propose a consistent estimator of V .
– Using the last two questions, propose a test with asymptotic level α ∈ (0,1).
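Putting the pieces together, a minimal sketch of the resulting test in Python (the function name `independence_test` is illustrative; `statistics.NormalDist` from the standard library supplies the Gaussian quantile). Under H0, the statistic T_n = √n (r̂ − p̂q̂) / √(p̂q̂(1−p̂)(1−q̂)) is asymptotically standard normal, so rejecting when |T_n| > z_{1−α/2} gives asymptotic level α:

```python
from statistics import NormalDist

def independence_test(xs, ys, alpha=0.05):
    """Asymptotic level-alpha test of H0: X and Y are independent.

    Rejects H0 (returns True) when |T_n| > z_{1-alpha/2}, where
    T_n = sqrt(n) * (r_hat - p_hat*q_hat) / sqrt(V_hat) and
    V_hat = p_hat*q_hat*(1-p_hat)*(1-q_hat) is the consistent
    estimator of V under H0.
    """
    n = len(xs)
    p_hat = sum(xs) / n
    q_hat = sum(ys) / n
    r_hat = sum(x * y for x, y in zip(xs, ys)) / n
    v_hat = p_hat * q_hat * (1 - p_hat) * (1 - q_hat)
    if v_hat == 0:   # degenerate sample: one coordinate is all 0s or all 1s
        return False
    t_n = (n ** 0.5) * (r_hat - p_hat * q_hat) / v_hat ** 0.5
    z = NormalDist().inv_cdf(1 - alpha / 2)   # quantile z_{1-alpha/2}
    return abs(t_n) > z

# Example: a perfectly correlated sample is rejected
xs = [1, 0] * 200
ys = [1, 0] * 200
print(independence_test(xs, ys))  # → True (reject H0)
```

For the correlated sample above, p̂ = q̂ = 0.5 and r̂ = 0.5, so T_n = √400 · 0.25 / 0.25 = 20, far beyond any reasonable quantile.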