Does zero covariance imply independence for binary random variables?


14

If X and Y are two random variables that can each take only two possible states, how can I show that Cov(X,Y) = 0 implies independence? This kind of goes against what I was taught, namely that Cov(X,Y) = 0 does not imply independence...

The hint says to start with 1 and 0 as the possible states and generalize from there. I can do that and show E(XY) = E(X)E(Y), but doesn't that fail to imply independence???

I'm stuck on how to carry this out mathematically.


As the title of the question suggests, it is not true in general.
Michael R. Chernick

5
The statement you are asked to prove is true. If X and Y are Bernoulli random variables with parameters p_1 and p_2 respectively, then E[X] = p_1 and E[Y] = p_2. So Cov(X,Y) = E[XY] − E[X]E[Y] equals 0 only if E[XY] = P{X = 1, Y = 1} equals p_1 p_2 = P{X = 1}P{Y = 1}, showing that {X = 1} and {Y = 1} are independent events. It is a standard result that if A and B are a pair of independent events, then so are A and B^c, A^c and B, and A^c and B^c independent pairs of events, i.e., X and Y are independent random variables. Now generalize.
Dilip Sarwate

Answers:


23

For binary variables, the expected value equals the probability of being equal to 1. Therefore,

E(XY) = P(XY = 1) = P(X = 1 ∩ Y = 1)
E(X) = P(X = 1)
E(Y) = P(Y = 1)

If the two have zero covariance, this means E(XY) = E(X)E(Y), which means

P(X = 1 ∩ Y = 1) = P(X = 1) · P(Y = 1)

It is trivial to verify that all other joint probabilities multiply as well, using the basic rules for independent events (e.g., if A and B are independent, then A and B^c are independent, etc.). Hence the two random variables are independent.
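A quick numerical sanity check of this argument (a sketch of my own, not from the original answer): set P(X = 1, Y = 1) = pq, i.e. zero covariance, and confirm that all four cells of the joint pmf factor into the marginals.

```python
# A sketch of the argument above: if two 0/1 variables are uncorrelated,
# every cell of the joint pmf factors into the marginals.
import random

random.seed(0)

for _ in range(5):
    p = random.random()      # P(X = 1)
    q = random.random()      # P(Y = 1)
    r = p * q                # zero covariance forces P(X = 1, Y = 1) = p*q

    joint = {
        (1, 1): r,
        (1, 0): p - r,
        (0, 1): q - r,
        (0, 0): 1 - p - q + r,
    }
    marg_x = {1: p, 0: 1 - p}
    marg_y = {1: q, 0: 1 - q}

    for (x, y), pr in joint.items():
        assert abs(pr - marg_x[x] * marg_y[y]) < 1e-12

print("Zero covariance => the joint pmf factors at all four points.")
```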


2
Concise and elegant. Classy! +1 =D
Marcelo Ventura

9

Both correlation and covariance measure linear association between two given variables, and they are under no obligation to detect any other form of association.

So the two variables might be associated in several other, non-linear ways, and covariance (and, therefore, correlation) could not distinguish them from the independent case.

As a very didactic, artificial and unrealistic example, one can consider X such that P(X = x) = 1/3 for x ∈ {−1, 0, 1} and also consider Y = X². Notice that they are not only associated: one is a function of the other. Nonetheless, their covariance is 0, for their association is orthogonal to the association that covariance can detect.
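For the skeptical reader, here is a short exact computation of that toy example (my own sketch, using Python's `fractions` module to avoid rounding): the covariance is exactly 0, yet the joint probability at (1, 1) does not factor into the marginals.

```python
# Exact computation for X uniform on {-1, 0, 1} and Y = X^2.
from fractions import Fraction

px = {x: Fraction(1, 3) for x in (-1, 0, 1)}

ex  = sum(x * p for x, p in px.items())         # E[X]   = 0
ey  = sum(x**2 * p for x, p in px.items())      # E[X^2] = 2/3
exy = sum(x**3 * p for x, p in px.items())      # E[XY] = E[X^3] = 0 (odd symmetry)

print("Cov(X, Y) =", exy - ex * ey)             # 0: no linear association

# Yet X and Y are dependent: the joint probability at (1, 1) does not factor.
p_joint = px[1]                                 # P(X = 1, Y = 1) = 1/3
p_prod  = px[1] * (px[-1] + px[1])              # P(X = 1) P(Y = 1) = 2/9
print(p_joint, "!=", p_prod)
```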

EDIT

Indeed, as indicated by @whuber, the above original answer was actually a comment on how the assertion is not universally true if both variables were not necessarily dichotomous. My bad!

So let's math up. (The local equivalent of Barney Stinson's "Suit up!")

Particular Case

If both X and Y are dichotomous then, without loss of generality, assume both take only the values 0 and 1, with arbitrary probabilities p, q and r given by

P(X = 1) = p ∈ [0, 1]
P(Y = 1) = q ∈ [0, 1]
P(X = 1, Y = 1) = r ∈ [0, 1],

which completely characterize the joint distribution of X and Y. Taking @DilipSarwate's hint, notice that those three values are enough to determine the joint distribution of (X, Y), since

P(X = 0, Y = 1) = P(Y = 1) − P(X = 1, Y = 1) = q − r
P(X = 1, Y = 0) = P(X = 1) − P(X = 1, Y = 1) = p − r
P(X = 0, Y = 0) = 1 − P(X = 0, Y = 1) − P(X = 1, Y = 0) − P(X = 1, Y = 1)
                = 1 − (q − r) − (p − r) − r = 1 − p − q + r.

(As an aside, r must of course respect p − r ∈ [0, 1], q − r ∈ [0, 1] and 1 − p − q + r ∈ [0, 1] besides r ∈ [0, 1], which is to say r ∈ [max(0, p + q − 1), min(p, q)].)

Notice that r=P(X=1,Y=1) might be equal to the product pq=P(X=1)P(Y=1), which would render X and Y independent, since

P(X = 0, Y = 0) = 1 − p − q + pq = (1 − p)(1 − q) = P(X = 0)P(Y = 0)
P(X = 1, Y = 0) = p − pq = p(1 − q) = P(X = 1)P(Y = 0)
P(X = 0, Y = 1) = q − pq = (1 − p)q = P(X = 0)P(Y = 1).

Yes, r might be equal to pq, BUT it can be different, as long as it respects the boundaries above.

Well, from the above joint distribution, we would have

E(X) = 0 · P(X = 0) + 1 · P(X = 1) = P(X = 1) = p
E(Y) = 0 · P(Y = 0) + 1 · P(Y = 1) = P(Y = 1) = q
E(XY) = 0 · P(XY = 0) + 1 · P(XY = 1) = P(XY = 1) = P(X = 1, Y = 1) = r
Cov(X, Y) = E(XY) − E(X)E(Y) = r − pq

Now, notice that X and Y are independent if and only if Cov(X, Y) = 0. Indeed, if X and Y are independent, then P(X = 1, Y = 1) = P(X = 1)P(Y = 1), which is to say r = pq, and therefore Cov(X, Y) = r − pq = 0. Conversely, if Cov(X, Y) = 0, then r − pq = 0, which is to say r = pq, and the factorization displayed above then shows that X and Y are independent.
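To see the equivalence concretely, here is a small sketch (my own illustration, with arbitrary marginals): fix p and q, sweep r over its admissible range, and watch Cov(X, Y) = r − pq vanish exactly at the independent case.

```python
# Cov(X, Y) = r - p*q vanishes exactly at the independent case r = p*q.
p, q = 0.5, 0.5
r_lo, r_hi = max(0.0, p + q - 1), min(p, q)    # admissible range for r

for k in range(11):
    r = r_lo + (r_hi - r_lo) * k / 10
    cov = r - p * q
    print(f"r = {r:.3f}   Cov(X, Y) = {cov:+.3f}   independent: {cov == 0.0}")
# Only r = 0.250 = p*q prints "independent: True".
```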

General Case

About the "without loss of generality" clause above: if X and Y were distributed otherwise, say, for a < b and c < d,

P(X = b) = p
P(Y = d) = q
P(X = b, Y = d) = r
then X* and Y* given by

X* = (X − a)/(b − a)   and   Y* = (Y − c)/(d − c)

would be distributed just as characterized above, since

X = a ⟺ X* = 0,   X = b ⟺ X* = 1,   Y = c ⟺ Y* = 0   and   Y = d ⟺ Y* = 1.

So X* and Y* are independent if and only if X and Y are independent.

Also, we would have

E(X*) = E((X − a)/(b − a)) = (E(X) − a)/(b − a)
E(Y*) = E((Y − c)/(d − c)) = (E(Y) − c)/(d − c)
E(X*Y*) = E((X − a)/(b − a) · (Y − c)/(d − c))
        = E((X − a)(Y − c)) / ((b − a)(d − c))
        = E(XY − cX − aY + ac) / ((b − a)(d − c))
        = (E(XY) − c E(X) − a E(Y) + ac) / ((b − a)(d − c))
Cov(X*, Y*) = E(X*Y*) − E(X*)E(Y*)
        = (E(XY) − c E(X) − a E(Y) + ac) / ((b − a)(d − c)) − (E(X) − a)/(b − a) · (E(Y) − c)/(d − c)
        = ([E(XY) − c E(X) − a E(Y) + ac] − [E(X) − a][E(Y) − c]) / ((b − a)(d − c))
        = ([E(XY) − c E(X) − a E(Y) + ac] − [E(X)E(Y) − c E(X) − a E(Y) + ac]) / ((b − a)(d − c))
        = (E(XY) − E(X)E(Y)) / ((b − a)(d − c))
        = Cov(X, Y) / ((b − a)(d − c)).
So Cov(X*, Y*) = 0 if and only if Cov(X, Y) = 0.
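A Monte Carlo sanity check of that identity (my own construction; the levels a, b, c, d and the probabilities are arbitrary choices): the sample covariance of the rescaled pair matches Cov(X, Y)/((b − a)(d − c)), exactly as the derivation above predicts.

```python
# Empirical check that Cov(X*, Y*) = Cov(X, Y) / ((b - a) * (d - c)).
import random

random.seed(1)
a, b, c, d = 2.0, 5.0, -1.0, 3.0    # arbitrary levels with a < b and c < d
p, q, r = 0.6, 0.3, 0.25            # P(X = b), P(Y = d), P(X = b, Y = d)

# Sample (X, Y) directly from the four cells of its joint distribution.
cells = [((b, d), r), ((b, c), p - r), ((a, d), q - r), ((a, c), 1 - p - q + r)]
values, weights = zip(*cells)
sample = random.choices(values, weights=weights, k=100_000)

def cov(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    return sum((x - mx) * (y - my) for x, y in pairs) / n

cov_xy   = cov(sample)
cov_star = cov([((x - a) / (b - a), (y - c) / (d - c)) for x, y in sample])
print(cov_star, "vs", cov_xy / ((b - a) * (d - c)))   # identical up to rounding
```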

=D


1
I recycled that answer from this post.
Marcelo Ventura

Verbatim cut and paste from your other post. Love it. +1
gammer

2
The problem with copy-and-paste is that your answer no longer seems to address the question: it is merely a comment on the question. It would be better, then, to post a comment with a link to your other answer.
whuber

2
How is this an answer to the question asked?
Dilip Sarwate

1
Your edits still don't answer the question, at least not at the level the question is asked. You write "Notice that r is not necessarily equal to the product pq. That exceptional situation corresponds to the case of independence between X and Y." which is a perfectly true statement, but only for the cognoscenti, because for the hoi polloi, independence requires not just that

(1)  P(X = 1, Y = 1) = P(X = 1)P(Y = 1)

but also

(2)  P(X = u, Y = v) = P(X = u)P(Y = v),   ∀ u, v ∈ {0, 1}.

Yes, (1) ⟹ (2) as the cognoscenti know; for lesser mortals, a proof that (1) ⟹ (2) is helpful.
Dilip Sarwate

3

IN GENERAL:

The criterion for independence is F(x, y) = F_X(x) F_Y(y). Or

(1)  f_{X,Y}(x, y) = f_X(x) f_Y(y)

"If two variables are independent, their covariance is 0. But, having a covariance of 0 does not imply the variables are independent."

This is nicely explained by Macro here, and in the Wikipedia entry for independence.

independence ⟹ zero covariance, yet

zero covariance ⇏ independence.

Great example: X ∼ N(0, 1), and Y = X². Covariance is zero (and E(XY) = 0, which is the criterion for orthogonality), yet they are dependent. Credit goes to this post.
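A simulation sketch of that counterexample (my own, with an arbitrary seed): the sample covariance is near zero although Y is a deterministic function of X.

```python
# X ~ N(0, 1) and Y = X^2: sample covariance ~ 0, yet Y is determined by X.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2

print("sample Cov(X, Y):", np.cov(x, y)[0, 1])     # close to 0
print("P(Y > 1 | X > 1):", np.mean(y[x > 1] > 1))  # exactly 1.0: dependent
```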


IN PARTICULAR (OP problem):

These are Bernoulli rv's, X and Y, with probabilities of success Pr(X = 1) and Pr(Y = 1).

Cov(X, Y) = E[XY] − E[X]E[Y] = Pr(X = 1 ∩ Y = 1) − Pr(X = 1)Pr(Y = 1)   (∗)

so Cov(X, Y) = 0 ⟹ Pr(X = 1, Y = 1) = Pr(X = 1)Pr(Y = 1).

This is equivalent to the condition for independence in Eq. (1).


(∗): here E[XY] = Σ over the domain of (X, Y) of x · y · Pr(X = x ∩ Y = y), by LOTUS, and every term with x · y = 0 vanishes, leaving Pr(X = 1 ∩ Y = 1).


As pointed out below, the argument is incomplete without what Dilip Sarwate had pointed out in his comments shortly after the OP appeared. After searching around, I found this proof of the missing part here:

If events A and B are independent, then events A^c and B are independent, and events A^c and B^c are also independent.

Proof: By definition,

A and B are independent ⟺ P(A ∩ B) = P(A)P(B).

But B = (A ∩ B) ∪ (A^c ∩ B), a disjoint union, so P(B) = P(A ∩ B) + P(A^c ∩ B), which yields:

P(A^c ∩ B) = P(B) − P(A ∩ B) = P(B) − P(A)P(B) = P(B)[1 − P(A)] = P(B)P(A^c).

Repeat the argument for the events A^c and B^c, this time starting from the statement that A^c and B are independent and taking the complement of B.

Similarly, A and B^c are independent events.
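A numeric illustration of the lemma (the numbers are my own arbitrary choice): starting from P(A ∩ B) = P(A)P(B), all three complement pairs factor as well.

```python
# Independence of A and B propagates to all complement pairs.
P_A, P_B = 0.3, 0.6
P_AB = P_A * P_B                       # assume A and B independent

P_AcB  = P_B - P_AB                    # B = (A ∩ B) ∪ (A^c ∩ B)
P_ABc  = P_A - P_AB                    # A = (A ∩ B) ∪ (A ∩ B^c)
P_AcBc = 1 - P_A - P_B + P_AB          # complement of A ∪ B

assert abs(P_AcB  - (1 - P_A) * P_B)       < 1e-12   # A^c and B
assert abs(P_ABc  - P_A * (1 - P_B))       < 1e-12   # A and B^c
assert abs(P_AcBc - (1 - P_A) * (1 - P_B)) < 1e-12   # A^c and B^c
print("All three complement pairs are independent as well.")
```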

So, we have shown already that

Pr(X=1,Y=1)=Pr(X=1)Pr(Y=1)
and the above shows that this implies that
Pr(X = i, Y = j) = Pr(X = i)Pr(Y = j),   ∀ i, j ∈ {0, 1}
that is, the joint pmf factors into the product of marginal pmfs everywhere, not just at (1,1). Hence, uncorrelated Bernoulli random variables X and Y are also independent random variables.

2
Actually that's not an equivalent condition to Eq. (1). All you showed was that f_{X,Y}(1, 1) = f_X(1) f_Y(1).
gammer

Please consider replacing that image with your own equations, preferably ones that don't use overbars to denote complements. The overbars in the image are very hard to see.
Dilip Sarwate

@DilipSarwate No problem. Is it better, now?
Antoni Parellada

1
Thanks. Also, note that strictly speaking, you also need to show that A and B^c are independent events, since the factorization of the joint pmf into the product of the marginal pmfs must hold at all four points. Perhaps adding the sentence "Similarly, A and B^c are independent events" right after the proof that A^c and B are independent events will work.
Dilip Sarwate

@DilipSarwate Thank you very much for your help getting it right. The proof as it was before all the editing seemed self-explanatory, because of all the inherent symmetry, but it clearly couldn't be taken for granted. I am very appreciative of your assistance.
Antoni Parellada
Licensed under cc by-sa 3.0 with attribution required.