Does non-zero correlation imply dependence?



We know that zero correlation does not imply independence. I am interested in whether non-zero correlation implies dependence. That is, for random variables $X$ and $Y$ with $\text{Corr}(X,Y) \neq 0$, does it follow in general that $f_{X,Y}(x,y) \neq f_X(x)\,f_Y(y)$?

Answers:



Yes, because

$$\text{Corr}(X,Y) \neq 0 \implies \text{Cov}(X,Y) \neq 0$$

$$\implies E(XY) - E(X)E(Y) \neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy - \int x\,f_X(x)\,dx \int y\,f_Y(y)\,dy \neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy - \int\!\!\int xy\,f_X(x)\,f_Y(y)\,dx\,dy \neq 0$$

$$\implies \int\!\!\int xy\,\big[f_{X,Y}(x,y) - f_X(x)\,f_Y(y)\big]\,dx\,dy \neq 0$$

which is impossible if $f_{X,Y}(x,y) - f_X(x)\,f_Y(y) = 0$ for all $\{x,y\}$. So

$$\text{Corr}(X,Y) \neq 0 \implies \exists\,\{x,y\} : f_{X,Y}(x,y) \neq f_X(x)\,f_Y(y).$$
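A quick numerical sanity check of this conclusion (my addition, not part of the original answer): for a small hypothetical discrete joint pmf with nonzero covariance, the joint cannot equal the product of its marginals. A minimal sketch in Python:

```python
import numpy as np

# Hypothetical 2x2 joint pmf for X, Y taking values in {0, 1}:
# rows index x, columns index y.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

px = joint.sum(axis=1)      # marginal pmf of X: [0.5, 0.5]
py = joint.sum(axis=0)      # marginal pmf of Y: [0.5, 0.5]

vals = np.array([0.0, 1.0])
ex = vals @ px              # E[X] = 0.5
ey = vals @ py              # E[Y] = 0.5
exy = vals @ joint @ vals   # E[XY] = P(X=1, Y=1) = 0.4

cov = exy - ex * ey         # 0.4 - 0.25 = 0.15 (up to float rounding), nonzero

# Under independence the joint would have to be the outer product of the marginals.
product = np.outer(px, py)  # every entry 0.25

print(np.allclose(joint, product))  # False: nonzero covariance forces dependence
```

The point of the check is exactly the contrapositive in the derivation above: since the covariance is nonzero, there must exist some $(x, y)$ where the joint pmf differs from the product of the marginals, and here it does at every cell.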

Question: what happens with random variables that have no densities?


Alecos, I have a dumb question. What does the fancy arrow mean in, e.g., line 1? I imagine something like "imply," but I'm uncertain.
Sycorax says Reinstate Monica

@user777 You mean $\implies$? Indeed, it means "implies".
Alecos Papadopoulos

The reason to only use the implication arrow in informal argument: is the implication arrow left or right associative?
kasterma

`\implies` produces $\implies$, which looks better than `\rightarrow`, which produces $\rightarrow$.
Dilip Sarwate


Let $X$ and $Y$ denote random variables such that $E[X^2]$ and $E[Y^2]$ are finite. Then $E[XY]$, $E[X]$, and $E[Y]$ are all finite.

Restricting our attention to such random variables, let $A$ denote the statement that $X$ and $Y$ are independent random variables, and $B$ the statement that $X$ and $Y$ are uncorrelated random variables, that is, $E[XY] = E[X]E[Y]$. Then we know that $A$ implies $B$; that is, independent random variables are uncorrelated random variables. (Indeed, one definition of independent random variables is that $E[g(X)h(Y)]$ equals $E[g(X)]\,E[h(Y)]$ for all measurable functions $g(\cdot)$ and $h(\cdot)$.) This is usually expressed as

$$A \implies B.$$
But $A \implies B$ is logically equivalent to $\neg B \implies \neg A$, that is,

correlated random variables are dependent random variables.

If $E[XY]$, $E[X]$, or $E[Y]$ is not finite or does not exist, then it is not possible to say whether $X$ and $Y$ are uncorrelated in the classical sense, in which uncorrelated random variables are those for which $E[XY] = E[X]E[Y]$. For example, $X$ and $Y$ could be independent Cauchy random variables (for which the mean does not exist). Are they uncorrelated random variables in the classical sense?
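To illustrate that last point (my addition, not part of the answer): for Cauchy draws the sample mean never settles down as the sample grows, because the population mean does not exist, so there is no classical quantity for a "sample correlation" to estimate. A hedged sketch in Python, using NumPy's standard Cauchy sampler:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent standard Cauchy samples (mean and variance do not exist).
x = rng.standard_cauchy(100_000)
y = rng.standard_cauchy(100_000)

# Running means fail to stabilize: occasional enormous draws keep
# dragging the average around no matter how large n gets.
for n in (100, 1_000, 10_000, 100_000):
    print(n, x[:n].mean())

# Heavy tails: the largest draw dwarfs the bulk of the sample.
print(np.abs(x).max() > 100)  # True with overwhelming probability
```

Contrast this with a normal sample, where the running mean would converge to the true mean by the law of large numbers; for the Cauchy, that law simply does not apply.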


The nice thing about this answer is that it applies whether or not the random variables in question admit a density function, as opposed to other answers on this thread. This is true due to the fact that expectations can be defined with Stieltjes integrals using the CDF, with no mention of the density.
ahfoss


Here is a purely logical proof. If $A \implies B$, then necessarily $\neg B \implies \neg A$, as the two are equivalent. Thus if $\neg B$, then $\neg A$. Now replace $A$ with independence and $B$ with non-correlation.

Think about the statement "if a volcano erupts, there is going to be damage". Now consider a case where there is no damage. Clearly the volcano didn't erupt, or we would have a contradiction.

Similarly, consider the statement "if $X, Y$ are independent, then $X, Y$ are uncorrelated". Now take the case where $X, Y$ are correlated. Clearly they can't be independent, for if they were, they would also be uncorrelated. Thus conclude dependence.
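The contrapositive equivalence this argument rests on can be checked mechanically (a throwaway sketch of my own, not part of the answer): $A \implies B$ and $\neg B \implies \neg A$ agree on every truth assignment.

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is defined as (not p) or q."""
    return (not p) or q

# Exhaustively check all four truth assignments for A and B.
for a, b in product([False, True], repeat=2):
    assert implies(a, b) == implies(not b, not a)

print("contrapositive verified")  # prints: contrapositive verified
```

Four assignments is the whole truth table, so this is an exhaustive verification, not a sampled one.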


If you read my answer carefully, you will see that I too used the argument you have made in your answer, namely that $A \implies B$ is the same as $\neg B \implies \neg A$.
Dilip Sarwate

@DilipSarwate Edited to reflect that.
Tony
Licensed under cc by-sa 3.0 with attribution required.