
how would i go about solving this problem?

Suppose that X and Y are random variables with joint density:
f_{X,Y}(x, y) = 1 for -y < x < y, where 0 < y < 1,
f_{X,Y}(x, y) = 0 elsewhere.

Show that Cov(X, Y) = 0, but X and Y are not independent.

Please could you show me this step by step? Thanks

2006-11-22 10:07:33 · 1 answer · asked by drummanmatthew 2 in Science & Mathematics > Mathematics

1 answer

I'm not sure precisely which way you were taught to do this, but here's how I'd do it:
Cov(X,Y) = E[XY] - E[X]E[Y]. Now E[XY] = (integral from y=0 to y=1) of (integral from x=-y to x=y) of xy*1 dx dy. The inner integral is (0.5x^2*y) evaluated from x=-y to x=y; substituting gives 0.5y^3 - 0.5y^3 = 0, so the whole double integral is 0. E[X] is also 0 (you could compute it by integrating again, but the density is symmetric about the y-axis), so Cov(X,Y) = 0 - 0*E[Y] = 0.
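If you want to double-check those integrals rather than trust the algebra, here's a quick symbolic sketch in Python using sympy (assuming you have sympy installed; this isn't part of the original solution, just a verification):

```python
# Symbolic check of E[XY], E[X], E[Y] over the region -y < x < y, 0 < y < 1,
# where the joint density is the constant 1.
import sympy as sp

x, y = sp.symbols('x y')

# E[XY]: inner integral over x from -y to y, then outer integral over y from 0 to 1
E_XY = sp.integrate(sp.integrate(x * y, (x, -y, y)), (y, 0, 1))

# E[X] and E[Y] over the same region
E_X = sp.integrate(sp.integrate(x, (x, -y, y)), (y, 0, 1))
E_Y = sp.integrate(sp.integrate(y, (x, -y, y)), (y, 0, 1))

cov = E_XY - E_X * E_Y
print(E_XY, E_X, E_Y, cov)  # prints: 0 0 2/3 0
```

This matches the hand computation: E[XY] = 0 and E[X] = 0, so the covariance is 0 regardless of E[Y] (which comes out to 2/3).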
However, the variables are definitely not independent: if Y <= 0.5 then P(X > 0.5 | Y) = 0 (since |X| < Y), but if Y is close to 1 then P(X > 0.5 | Y) > 0. The conditional distribution of X changes with Y, so they can't be independent.
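You can also see the dependence numerically. Here's a small Monte Carlo sketch (again not part of the original answer; it assumes numpy is available). It samples Y from its marginal density f_Y(y) = 2y on (0,1) via the inverse-CDF method (the CDF is y^2, so Y = sqrt(U)), then draws X uniformly on (-Y, Y):

```python
# Monte Carlo illustration: covariance near 0, but the conditional
# probability P(X > 0.5 | Y) clearly depends on Y.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

u = rng.random(n)
y = np.sqrt(u)              # marginal f_Y(y) = 2y on (0,1), sampled by inverse CDF
x = rng.uniform(-y, y)      # X | Y=y is uniform on (-y, y)

print("Cov(X,Y) ~", np.cov(x, y)[0, 1])                 # close to 0
print("P(X>0.5 | Y<0.5) =", np.mean(x[y < 0.5] > 0.5))  # exactly 0, since |X| < Y
print("P(X>0.5 | Y>0.9) =", np.mean(x[y > 0.9] > 0.5))  # clearly positive
```

The two conditional probabilities differ, which is exactly the non-independence argument above, while the sample covariance stays near 0.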

2006-11-22 10:20:28 · answer #1 · answered by stephen m 4 · 0 0
