Expectation of the reciprocal of a variable


Answers:


27

Can it be $1/E(X)$?

No, in general it cannot. Jensen's inequality tells us that if $X$ is a random variable and $\varphi$ is a convex function, then $\varphi(E[X]) \le E[\varphi(X)]$. If $X$ is strictly positive, then $1/X$ is convex, so $E[1/X] \ge 1/E[X]$; and since $1/x$ is a strictly convex function, equality occurs only when $X$ has zero variance ... so in the cases we tend to be interested in, the two are generally not equal.

Assuming we are dealing with a positive variable, if it is clear to you that $X$ and $1/X$ are inversely related ($\mathrm{Cov}(X, 1/X) \le 0$), then this implies $E(X \cdot 1/X) - E(X)E(1/X) \le 0$, i.e. $E(X)E(1/X) \ge 1$, and hence $E(1/X) \ge 1/E(X)$.

If it confuses you to apply expectation in the denominator:

Use the Law of the Unconscious Statistician:

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx$$

(in the continuous case).

So, with $g(X) = \frac{1}{X}$,
$$E\left[\frac{1}{X}\right] = \int_{-\infty}^{\infty} \frac{f(x)}{x}\,dx$$

In some cases the expectation can be evaluated by inspection (e.g. with gamma random variables), by deriving the distribution of the inverse, or by other means.
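As a quick numeric illustration of the Jensen inequality argument above (a sketch, not part of the original answer; the gamma parameters are chosen for convenience), one can compare a Monte Carlo estimate of $E[1/X]$ with $1/E[X]$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice: X ~ Gamma(shape=3, scale=1), a strictly positive variable.
# For this shape, E[X] = 3, and the exact E[1/X] = 1/(shape - 1) = 0.5.
x = rng.gamma(shape=3.0, scale=1.0, size=1_000_000)

e_inv_x = np.mean(1.0 / x)   # Monte Carlo estimate of E[1/X]
inv_e_x = 1.0 / np.mean(x)   # 1/E[X]

# Jensen's inequality for the convex function 1/x predicts E[1/X] >= 1/E[X].
print(e_inv_x, inv_e_x)
```

Here the two quantities differ substantially (roughly $0.5$ versus $1/3$), as the strict convexity of $1/x$ predicts for any non-degenerate $X$.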


14

As Glen_b said, it is probably wrong, because the reciprocal is a nonlinear function. If you want an approximation to $E(1/X)$, perhaps you can use a Taylor expansion around $E(X)$:

$$E\left(\frac{1}{X}\right) \approx E\left(\frac{1}{E(X)} - \frac{1}{E(X)^2}\big(X - E(X)\big) + \frac{1}{E(X)^3}\big(X - E(X)\big)^2\right) = \frac{1}{E(X)} + \frac{1}{E(X)^3}\mathrm{Var}(X)$$
so you just need the mean and variance of $X$, and if the distribution of $X$ is symmetric this approximation can be very accurate.

EDIT: the caveat below may be quite important; see the comment by BloXX below.
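As a rough check of this approximation (a sketch added here, not from the answer itself; the gamma example is an illustrative choice with a known closed form), one can compare it against the exact value:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choice: X ~ Gamma(shape=10, scale=1).
# E[X] = 10, Var(X) = 10, and the exact E[1/X] = 1/(shape - 1) = 1/9.
shape = 10.0
x = rng.gamma(shape=shape, scale=1.0, size=1_000_000)

m, v = np.mean(x), np.var(x)
taylor = 1.0 / m + v / m**3   # second-order Taylor approximation from the answer
exact = 1.0 / (shape - 1.0)   # closed form for this gamma case

print(taylor, exact)
```

For this well-concentrated distribution the approximation ($\approx 0.110$) is close to the exact value ($1/9 \approx 0.111$); as the comments below note, there is no such guarantee in general.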


oh yes yes... I am very sorry that I could not apprehend that fact... I have one more question... Is this applicable to any kind of function? Actually I am stuck with $|X|$... How can the expectation of $|X|$ be deduced in terms of $E(X)$ and $V(X)$?
Sandipan Karmakar

2
I don't think you can use it for $|X|$, as that function is not differentiable. I would rather divide the problem into two cases and say $E(|X|) = E(X \mid X>0)\,p(X>0) - E(X \mid X<0)\,p(X<0)$, I guess.
Matteo Fasiolo

1
@MatteoFasiolo Can you please explain why the symmetry of the distribution of X (or lack thereof) has an effect on the accuracy of the Taylor approximation? Do you have a source that you could point me to that explains why this is?
Aaron Hendrickson

1
@AaronHendrickson my reasoning is simply that the next term in the expansion is proportional to $E\{(X - E(X))^3\}$, which is related to the skewness of the distribution of $X$. Skewness is an asymmetry measure. However, zero skewness does not guarantee symmetry, and I am not sure whether symmetry guarantees zero skewness. Hence, this is all heuristic and there might be plenty of counterexamples.
Matteo Fasiolo

4
I don't understand how this solution gets so many upvotes. For a single random variable $X$ there is no justification of the quality of this approximation. The third derivative of $f(x) = 1/x$ is not bounded. Moreover, the remainder of the approximation is $\frac{1}{6} f'''(\xi)(X-\mu)^3$, where $\xi$ is itself a random variable between $X$ and $\mu$. The remainder won't vanish in general and may be very large. The Taylor approximation may only be useful if one has a sequence of random variables with $X_n - \mu = O_p(a_n)$, where $a_n \to 0$. Even then, uniform integrability is additionally needed if one is interested in the expectation.
BloXX

8

Others have already explained that the answer to the question is NO, except in trivial cases. Below we give an approach to finding $E\frac{1}{X}$ when $X>0$ with probability one and the moment generating function $M_X(t) = E e^{tX}$ exists. An application of this method (and a generalization) is given in Expected value of 1/x when x follows a Beta distribution; we will here also give a simpler example.

First, note that $\int_0^\infty e^{-tx}\,dt = \frac{1}{x}$ (a simple calculus exercise). Then, write

$$E\left(\frac{1}{X}\right) = \int_0^\infty x^{-1} f(x)\,dx = \int_0^\infty \left(\int_0^\infty e^{-tx}\,dt\right) f(x)\,dx = \int_0^\infty \left(\int_0^\infty e^{-tx} f(x)\,dx\right) dt = \int_0^\infty M_X(-t)\,dt$$
A simple application: let $X$ have the exponential distribution with rate 1, that is, with density $e^{-x},\, x>0$, and moment generating function $M_X(t) = \frac{1}{1-t},\, t<1$. Then $\int_0^\infty M_X(-t)\,dt = \int_0^\infty \frac{1}{1+t}\,dt = \ln(1+t)\big|_0^\infty = \infty$, so it definitely does not converge, and is very different from $\frac{1}{EX} = \frac{1}{1} = 1$.
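The identity $E(1/X) = \int_0^\infty M_X(-t)\,dt$ can also be checked numerically in a case where it converges (a sketch added here; the gamma example and the use of `scipy.integrate.quad` are my choices, not part of the answer):

```python
import numpy as np
from scipy.integrate import quad

# Illustrative check with X ~ Gamma(shape=3, scale=1):
# M_X(t) = (1 - t)^(-3) for t < 1, so M_X(-t) = (1 + t)^(-3),
# and the exact E[1/X] = 1/(shape - 1) = 0.5.
shape = 3.0

def mgf_neg(t):
    return (1.0 + t) ** (-shape)

# Integrate M_X(-t) over [0, infinity); this should equal E[1/X].
integral, err = quad(mgf_neg, 0.0, np.inf)
print(integral)
```

Here the integral reproduces the known value $E[1/X] = 0.5$, while $1/E[X] = 1/3$, again confirming the two are different.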

7

An alternative approach to calculating $E(1/X)$, knowing $X$ is a positive random variable, is through its moment generating function $E[e^{-\lambda X}]$. Since by elementary calculus
$$\int_0^\infty e^{-\lambda x}\,d\lambda = \frac{1}{x},$$
we have, by Fubini's theorem,
$$\int_0^\infty E[e^{-\lambda X}]\,d\lambda = E\left[\frac{1}{X}\right].$$

2
The idea here is right, but the details are wrong. Please check.
kjetil b halvorsen

1
@Kjetil I don't see what the problem is: apart from the inconsequential differences of using $-tX$ instead of $tX$ in the definition of the MGF and naming the variable $t$ instead of $\lambda$, the answer you just posted is identical to this one.
whuber

1
You are right, the problem was smaller than I thought. Still, this answer would be better with some more details. I will upvote it tomorrow (when I have new votes).
kjetil b halvorsen

1

To first give an intuition, what about using the discrete case with a finite sample to illustrate that $E(1/X) \neq 1/E(X)$ (putting aside cases such as $E(X)=0$)?

With a finite sample, using the term average for expectation is not that abusive; thus if one has on the one hand

$$E(X) = \frac{1}{N}\sum_{i=1}^N X_i$$

and one has on the other hand

$$E(1/X) = \frac{1}{N}\sum_{i=1}^N 1/X_i$$

it becomes obvious that, with N>1,

$$E(1/X) = \frac{1}{N}\sum_{i=1}^N 1/X_i \neq \frac{N}{\sum_{i=1}^N X_i} = 1/E(X)$$

This leads one to say that, basically, $E(1/X) \neq 1/E(X)$, since the inverse of the (discrete) sum is not the (discrete) sum of the inverses.
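A quick numeric illustration of this finite-sample inequality (a sketch with made-up numbers, not part of the original answer):

```python
import numpy as np

# Made-up positive sample to illustrate that the mean of inverses
# differs from the inverse of the mean.
x = np.array([1.0, 2.0, 4.0, 8.0])

mean_of_inv = np.mean(1.0 / x)   # (1 + 1/2 + 1/4 + 1/8) / 4 = 0.46875
inv_of_mean = 1.0 / np.mean(x)   # 1 / 3.75 ≈ 0.2667

print(mean_of_inv, inv_of_mean)
```

For positive samples the gap always has the same sign, since the arithmetic mean is at least the harmonic mean.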

Analogously, in the asymptotic non-0-centered continuous case, one has

$$E(1/X) = \int \frac{f(x)}{x}\,dx \neq \frac{1}{\int x f(x)\,dx} = 1/E(X).$$

Licensed under cc by-sa 3.0 with attribution required.