Two envelopes problem revisited


16

I was thinking about this problem:

http://en.wikipedia.org/wiki/Two_envelopes_problem

I believe the solution and I think I understand it, but if I take the following approach I get completely confused.

Problem 1:

I offer you the following game. You pay me $10 and I flip a fair coin. Heads, I give you $5; tails, I give you $20.

The expectation is $12.50, so you should always play the game.

Problem 2:

I give you an envelope containing $10; the envelope is open and you can check. Then I show you another envelope, this time closed, and tell you: this envelope contains either $5 or $20 with equal probability. Do you want to swap?

I feel this is exactly the same as Problem 1: you forgo $10 for either $5 or $20. So you would always want to switch.

Problem 3:

Same as above, but the envelopes are closed. So you hold some amount X but do not know how much it is. You are told the other envelope contains double or half of it. Now, if you follow the same logic, you want to switch. This is the envelope paradox.

What changed when the envelopes were closed?
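The expectation in Problem 1 is easy to sanity-check numerically. A minimal sketch of the coin-flip game described above (the function name is mine):

```python
import random

def play_problem_1(n_trials=100_000, seed=0):
    """Simulate Problem 1: pay $10, flip a fair coin,
    receive $5 on heads or $20 on tails."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        payout = 5 if rng.random() < 0.5 else 20
        total += payout - 10  # net gain after paying $10 to play
    return total / n_trials

# The average net gain converges to 0.5*5 + 0.5*20 - 10 = 2.5 dollars.
print(play_problem_1())
```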

Edit:

Some people have argued that Problem 3 is not the envelope problem, and below I try to explain why I think it is, by analyzing what each game looks like. This also sets the games up better.

Here is a description of Problem 3.

From the perspective of the person organizing the game:

I have two envelopes. In one I put $10, close it, and give it to the player. I tell him: I have one more envelope that contains either double or half the amount of the envelope I just gave you. Do you want to switch? Then I flip a fair coin, put in $5 on heads or $20 on tails, and hand him that envelope. Then I ask him: the envelope you just gave back to me contains double or half the amount of the one you are now holding. Do you want to switch?

From the player's perspective:

I am given an envelope and told that another envelope contains double or half its amount with equal probability. I want to switch: I know I hold some amount X, and since (1/2)((1/2)X + 2X) > X, switching looks profitable. I receive the other envelope, and suddenly I face exactly the same situation: the first envelope contains double or half of my new amount, so I want to switch again.


2
At least for me, the key insight is that you cannot say "I have X, therefore (1/2 * X + 2X)/2 > X". The overall average odds are 50/50, but for any specific X the expected odds are no longer 50/50: the larger X is, the lower the chance of having 2X in the other envelope (for a positive, finite distribution). So, integrating over the possible X'es, sum(p(X) * (1/2*X*f(X) + 2X*(1-f(X)))) = X, where f(X) is the probability that the first envelope is the larger one, given any particular X.
Peteris

1
In the statement of the paradox, an amount X is chosen by the experimenter, and the experimenter randomly decides whether to put X/2 or 2X in the other envelope. The fact that you keep conflating this with the situation created by the two-envelope paradox means you do not understand why the player should not believe there is a 50/50 chance that the other envelope contains X/2 or 2X. In the actual two-envelope problem, the probability that 2X is in the other envelope is either 0 or 1.
jsk

You're right, I don't understand (hence the question). I am trying to understand the difference between the Problem 3 I described and the envelope paradox. I understand that in the paradox there are two envelopes with X and 2X, but I don't see how that differs from giving someone an envelope and then flipping a coin to decide what amount to put in the other.
evan54

1
The trick here is that it is a flawed assumption that the X/2 and 2X outcomes are equally likely. If 2X is in the other envelope, the expected gain from switching is 2X − X = X. If X/2 is in the other envelope, the expected gain from switching is X/2 − X = −X/2. The player does not know which situation he is in, but that does not mean he should believe there is a 50/50 chance of each.
jsk

1
Let's suppose the envelopes contain X and 2X. If you end up with X, then the probability that 2X is in the other envelope is 1, and the probability that X/2 is in the other envelope is 0. If you end up with 2X, then the probability that 2·(2X) = 4X is in the other envelope is 0, and the probability that 2X/2 = X is in the other envelope is 1.
jsk

Answers:


23

1. UNNECESSARY PROBABILITIES.

The next two sections of this note analyze the “guess which is larger” and “two envelope” problems using standard decision-theoretic tools (2). The approach is straightforward, but appears to be new. In particular, it identifies a set of decision procedures for the two-envelope problem that are demonstrably superior to the “always switch” or “never switch” procedures.

Section 2 introduces (standard) terminology, concepts, and notation. It analyzes all possible decision procedures for the “guess which is larger” problem. Readers familiar with this material may skip the section. Section 3 applies a similar analysis to the two-envelope problem. Section 4, the conclusions, summarizes the main points.

All published analyses of these puzzles assume there is a probability distribution governing the possible states of nature. This assumption, however, is not part of the puzzle statements. The key idea of the present analysis is that dropping this (unwarranted) assumption leads to a simple resolution of the apparent paradoxes in these puzzles.

2. THE “GUESS WHICH IS LARGER” PROBLEM.

An experimenter is told that distinct real numbers x1 and x2 have been written on two slips of paper. She looks at the number on a randomly chosen slip. Based only on this single observation, she must decide whether it is the smaller or the larger of the two numbers.

Simple but open-ended questions about probability like this one are notorious for being confusing and counter-intuitive. In particular, there are at least three distinct ways in which probability enters the picture. To clarify them, let us adopt a formal experimental point of view (2).

Begin by specifying a loss function. Our goal is to minimize its expectation, in a sense defined below. A good choice is to make the loss equal 0 when the experimenter guesses correctly and 1 otherwise. The expectation of this loss function is then the probability of guessing incorrectly. In general, by assigning various penalties to wrong guesses, a loss function captures the objective of guessing correctly. To be sure, adopting a loss function is just as arbitrary as assuming a prior probability distribution on x1 and x2, but it is more natural and fundamental. When we face a decision, we naturally consider the consequences of being right or wrong; if there are no consequences either way, why should we care? We implicitly undertake consideration of potential loss whenever we make a (rational) decision, and so we benefit from an explicit consideration of loss, whereas the use of probability to describe the possible values on the slips of paper is unnecessary, artificial, and can prevent us from obtaining useful solutions.

Decision theory models observational outcomes and our analysis of them. It uses three additional mathematical objects: a sample space, a set of “states of nature,” and a decision procedure.

  • The sample space S consists of all possible observations; here it can be identified with R (the set of real numbers).

  • The states of nature Ω are the possible probability distributions governing the experimental outcome. (This is the first sense in which we may talk about the “probability” of an event.) In the “guess which is larger” problem, these are the discrete distributions taking values at distinct real numbers x1 and x2 with a probability of 1/2 at each value. Ω can be parameterized by {ω = (x1, x2) ∈ R × R | x1 > x2}.

  • The decision space is the binary set Δ = {smaller, larger} of possible decisions.

In these terms, the loss function is a real-valued function defined on Ω × Δ. It tells us how “bad” a decision is (the second argument) compared to reality (the first argument).

The most general decision procedure available to the experimenter is a randomized one: its value for any experimental outcome is a probability distribution on Δ. That is, the decision to make upon observing outcome x is not necessarily definite, but rather is to be chosen at random according to a distribution δ(x). (This is the second way in which probability may be involved.)

Because Δ has just two elements, any randomized procedure can be identified with the probability it assigns to a prespecified decision, which to be concrete we take to be “larger.”

[Figure: Spinner]

A physical spinner implements such a binary randomized procedure: a freely spinning pointer comes to rest in the upper region, corresponding to one decision in Δ, with probability δ(x), and otherwise stops in the lower left region with probability 1 − δ(x). The spinner is completely determined by specifying the value of δ(x) ∈ [0, 1].

Thus a decision procedure can be thought of as a function

δ: S → [0, 1],

where

Pr_δ(x)(larger) = δ(x)  and  Pr_δ(x)(smaller) = 1 − δ(x).

Conversely, any such function δ determines a randomized decision procedure. The randomized decisions include the deterministic decisions as the special case where the range of δ lies in {0, 1}.

Let us say that the cost of a decision procedure δ for an outcome x is the expected loss of δ(x); the expectation here is taken with respect to the probability distribution δ(x) on the decision space Δ. Each state of nature ω (which, recall, is a binomial probability distribution on the sample space S) determines the expected cost of any procedure δ; this is the risk of δ for ω, written Riskδ(ω). Here, the expectation is taken with respect to the state of nature ω.

Decision procedures are compared in terms of their risk functions. When the state of nature is truly unknown, if ε and δ are two procedures and Riskε(ω) ≥ Riskδ(ω) for all ω, then there is no sense in using procedure ε, because procedure δ is never any worse (and might be better in some cases). Such a procedure ε is inadmissible; otherwise, it is admissible. Often many admissible procedures exist. We shall consider any of them “good” because none of them can be consistently out-performed by some other procedure.

Note that no prior distribution on Ω (a “mixed strategy for C” in the terminology of (1)) is introduced. This is the third way in which probability might be part of the problem setting. Avoiding its use makes the present analysis more general and simpler than that of (1) and its references.

Table 1 evaluates the risk when the true state of nature is given by ω = (x1, x2). Recall that x1 > x2.

Table 1.

Decision:                 Larger              Smaller
Outcome   Probability     Probability  Loss   Probability   Loss   Cost
x1        1/2             δ(x1)        0      1 − δ(x1)     1      1 − δ(x1)
x2        1/2             δ(x2)        1      1 − δ(x2)     0      δ(x2)

Risk(x1, x2) = (1 − δ(x1) + δ(x2))/2.

In these terms the “guess which is larger” problem becomes

Is there a procedure δ for which the risk [1 − δ(max(x1, x2)) + δ(min(x1, x2))]/2 is surely less than 1/2, no matter what values x1 > x2 take?

This is equivalent to requiring that δ(x) > δ(y) whenever x > y. Whence, it is necessary and sufficient for the experimenter's decision procedure to be specified by some strictly increasing function δ: S → [0, 1]. This set of procedures includes, but is larger than, all the “mixed strategies Q” of (1). There are lots of randomized decision procedures that are better than any unrandomized procedure!
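As a numerical sanity check of this claim, one can simulate the procedure with a particular strictly increasing δ. The sketch below uses a logistic curve, which is just one arbitrary choice of strictly increasing function from R into [0, 1] (all names are illustrative):

```python
import math
import random

def guess_larger(x, rng):
    """Randomized decision: declare 'larger' with probability delta(x),
    where delta is strictly increasing into (0, 1) -- here, a logistic curve."""
    delta = 1.0 / (1.0 + math.exp(-x))
    return "larger" if rng.random() < delta else "smaller"

def win_rate(x1, x2, n=200_000, seed=1):
    """Fraction of correct guesses when the slips hold x1 > x2
    and a random slip is revealed each round."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        shown = x1 if rng.random() < 0.5 else x2   # randomly chosen slip
        guess = guess_larger(shown, rng)
        truth = "larger" if shown == x1 else "smaller"
        wins += (guess == truth)
    return wins / n

# For any pair x1 > x2, the win rate exceeds 1/2:
print(win_rate(1.0, -1.0))   # well-separated values: comfortably above 1/2
print(win_rate(0.2, 0.1))    # close values: still above 1/2, barely
```

The win rate equals 1/2 + (δ(x1) − δ(x2))/2, so any strictly increasing δ beats a coin toss, though by an amount that shrinks as the two numbers get close.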

3. THE “TWO ENVELOPE” PROBLEM.

It is encouraging that this straightforward analysis disclosed a large set of solutions to the “guess which is larger” problem, including good ones that have not been identified before. Let us see what the same approach can reveal about the other problem before us, the “two envelope” problem (or “box problem,” as it is sometimes called). This concerns a game played by randomly selecting one of two envelopes, one of which is known to have twice as much money in it as the other. After opening the envelope and observing the amount x of money in it, the player decides whether to keep the money in the unopened envelope (to “switch”) or to keep the money in the opened envelope. One would think that switching and not switching would be equally acceptable strategies, because the player is equally uncertain as to which envelope contains the larger amount. The paradox is that switching seems to be the superior option, because it offers “equally probable” alternatives between payoffs of 2x and x/2, whose expected value of 5x/4 exceeds the value in the opened envelope. Note that both these strategies are deterministic and constant.

In this situation, we may formally write

S = {x ∈ R | x > 0},  Ω = {discrete distributions supported on {ω, 2ω} | ω > 0 and Pr(ω) = 1/2},  and  Δ = {Switch, Do not switch}.

As before, any decision procedure δ can be considered a function from S to [0,1], this time by associating it with the probability of not switching, which again can be written δ(x). The probability of switching must of course be the complementary value 1δ(x).

The loss, shown in Table 2, is the negative of the game's payoff. It is a function of the true state of nature ω, the outcome x (which can be either ω or 2ω), and the decision, which depends on the outcome.

Table 2.

              Loss       Loss
Outcome (x)   Switch     Do not switch    Cost
ω             −2ω        −ω               −ω[2(1 − δ(ω)) + δ(ω)]
2ω            −ω         −2ω              −ω[1 − δ(2ω) + 2δ(2ω)]

In addition to displaying the loss function, Table 2 also computes the cost of an arbitrary decision procedure δ. Because the game produces the two outcomes with equal probabilities of 1/2, the risk when ω is the true state of nature is

Riskδ(ω) = −ω[2(1 − δ(ω)) + δ(ω)]/2 − ω[1 − δ(2ω) + 2δ(2ω)]/2 = −(ω/2)[3 + δ(2ω) − δ(ω)].

A constant procedure, which means always switching (δ(x) = 0) or always standing pat (δ(x) = 1), will have risk −3ω/2. Any strictly increasing function, or more generally, any function δ with range in [0, 1] for which δ(2x) > δ(x) for all positive real x, determines a procedure δ having a risk function that is always strictly less than −3ω/2 and thus is superior to either constant procedure, regardless of the true state of nature ω! The constant procedures therefore are inadmissible, because there exist procedures with risks that are sometimes lower, and never higher, regardless of the state of nature.
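This conclusion can also be checked by direct simulation of the game. A sketch (again taking a logistic δ as one arbitrary strictly increasing choice; the function names are illustrative):

```python
import math
import random

def expected_payoff(omega, delta, n=200_000, seed=2):
    """Average winnings when the envelopes hold (omega, 2*omega) and the
    player keeps the opened amount x with probability delta(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = omega if rng.random() < 0.5 else 2 * omega   # opened envelope
        other = 3 * omega - x                            # the unopened one
        keep = rng.random() < delta(x)
        total += x if keep else other
    return total / n

logistic = lambda x: 1.0 / (1.0 + math.exp(-x))  # delta(2x) > delta(x) for x > 0
always_switch = lambda x: 0.0                    # a constant procedure

# For any omega, the increasing-delta player averages more than 3*omega/2,
# while any constant strategy averages exactly 3*omega/2.
for omega in (0.5, 2.0):
    print(expected_payoff(omega, logistic), expected_payoff(omega, always_switch))
```

The average payoff is (ω/2)[3 + δ(2ω) − δ(ω)], i.e. the negative of the risk above, so the ordering of procedures is the same.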

[Figure: Strategy]

Comparing this to the preceding solution of the “guess which is larger” problem shows the close connection between the two. In both cases, an appropriately chosen randomized procedure is demonstrably superior to the “obvious” constant strategies.

These randomized strategies have some notable properties:

  • There are no bad situations for the randomized strategies: no matter how the amount of money in the envelope is chosen, in the long run these strategies will be no worse than a constant strategy.

  • No randomized strategy with limiting values of 0 and 1 dominates any of the others: if the expectation for δ when (ω,2ω) is in the envelopes exceeds the expectation for ε, then there exists some other possible state with (η,2η) in the envelopes and the expectation of ε exceeds that of δ .

  • The δ strategies include, as special cases, strategies equivalent to many of the Bayesian strategies. Any strategy that says “switch if x is less than some threshold T and stay otherwise” corresponds to δ(x) = 1 when x ≥ T, δ(x) = 0 otherwise.

What, then, is the fallacy in the argument that favors always switching? It lies in the implicit assumption that there is any probability distribution at all for the alternatives. Specifically, having observed x in the opened envelope, the intuitive argument for switching is based on the conditional probabilities Prob(Amount in unopened envelope | x was observed), which are probabilities defined on the set of underlying states of nature. But these are not computable from the data. The decision-theoretic framework does not require a probability distribution on Ω in order to solve the problem, nor does the problem specify one.

This result differs from the ones obtained by (1) and its references in a subtle but important way. The other solutions all assume (even though it is irrelevant) there is a prior probability distribution on Ω and then show, essentially, that it must be uniform over S. That, in turn, is impossible. However, the solutions to the two-envelope problem given here do not arise as the best decision procedures for some given prior distribution and thereby are overlooked by such an analysis. In the present treatment, it simply does not matter whether a prior probability distribution can exist or not. We might characterize this as a contrast between being uncertain what the envelopes contain (as described by a prior distribution) and being completely ignorant of their contents (so that no prior distribution is relevant).

4. CONCLUSIONS.

In the “guess which is larger” problem, a good procedure is to decide randomly that the observed value is the larger of the two, with a probability that increases as the observed value increases. There is no single best procedure. In the “two envelope” problem, a good procedure is again to decide randomly that the observed amount of money is worth keeping (that is, that it is the larger of the two), with a probability that increases as the observed value increases. Again there is no single best procedure. In both cases, if many players used such a procedure and independently played games for a given ω, then (regardless of the value of ω) on the whole they would win more than they lose, because their decision procedures favor selecting the larger amounts.

In both problems, making an additional assumption that is not part of the problem, namely a prior distribution on the states of nature, gives rise to an apparent paradox. By focusing on what is specified in each problem, this assumption is altogether avoided (tempting as it may be to make), allowing the paradoxes to disappear and straightforward solutions to emerge.

REFERENCES

(1) D. Samet, I. Samet, and D. Schmeidler, One Observation behind Two-Envelope Puzzles. American Mathematical Monthly 111 (April 2004) 347-351.

(2) J. Kiefer, Introduction to Statistical Inference. Springer-Verlag, New York, 1987.


8
This is a short article I wrote ten years ago but never published. (The new editor of the AMM saw no mathematical interest in it.) I have given talks in which I played the two-envelope game with the audience, using substantial amounts of real money.
whuber

1
Very nice write up! Joe Blitzstein talked about the two envelope problem in a Harvard Stat 110 lecture, which is available for free on YouTube, if anyone is interested btw.
Benjamin Lindqvist

@whuber Consider this variant. Suppose I choose two amounts of money such that one is twice as much as the other. Then I flip a fair coin to decide which amount goes in which envelope. Now you pick an envelope at random, and imagine the amount inside it, calling it x (if this step is questionable, consider the case of opening up the envelope and looking at the actual amount - since the reasoning applies no matter what value you see inside, it should apply with a general x). Then calculate the expected value of the money in the other envelope as E=(1/2)(x/2)+(1/2)(2x)=1.25x>x...
Zubin Mukerjee

I guess I don't understand where in that reasoning I "assumed a prior distribution on the states of nature". Did I? Clearly the reasoning cannot be correct, because I cannot justify switching to the other envelope by merely thinking about the first envelope (since the same logic would apply to the second, once I switch once).
Zubin Mukerjee

2
@Zubin There is a basic (but interesting) mistake in that analysis. Let θ be the smaller amount in the two envelopes. Given an observation of x, you know that either θ=x or θ=x/2 and that the likelihood of this observation in either case is 1/2. In the former case the amount Y in the other envelope is 2x and in the latter case it is x/2, but in order to assign a valid expectation to Y you must assume there is some probability distribution for θ. Equal likelihood is not equivalent to equal probability.
whuber

7

The issue in general with the two envelope problem is that the problem as presented on Wikipedia allows the size of the values in the envelopes to change after the first choice has been made. The problem has been formalized incorrectly.

However, a real world formulation of the problem is this: you have two identical envelopes: A and B, where B=2A. You can pick either envelope and then are offered to swap.

Case 1: You've picked A. If you switch you gain A dollars.

Case 2: You've picked B. If you switch you lose A dollars.

This is where the flaw in the two-envelope paradox enters. While it looks like you are either losing half the value or doubling your money, you still don't know the original value of A, and the value of A has been fixed. What you are looking at is either +A or −A, not 2A or (1/2)A.

If we assume that the probability of selecting A or B at each step is equal, then after the first offered swap, the results can be either:

Case 1: Picked A, No swap: Reward A

Case 2: Picked A, Swapped for B: Reward 2A

Case 3: Picked B, No swap: Reward 2A

Case 4: Picked B, Swapped for A: Reward A

The end result is that half the time you get A and half the time you get 2A. This will not change no matter how many times you are offered a swap, nor will it change based upon knowing what is in one envelope.
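The four cases can be tallied directly. A minimal simulation sketch of this setup (names are mine):

```python
import random

def average_reward(switch_prob=0.5, a=10, n=100_000, seed=3):
    """Envelopes hold a and 2a; pick one at random, then switch with the
    given probability. Returns the average reward."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        picked, other = (a, 2 * a) if rng.random() < 0.5 else (2 * a, a)
        total += other if rng.random() < switch_prob else picked
    return total / n

# Always switch, never switch, or a coin-flip: all average 1.5*a.
print(average_reward(1.0), average_reward(0.0), average_reward(0.5))
```

Whatever the switching rule, half the time the final envelope holds A and half the time 2A, so the average is always 1.5A.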


IMO, the problem says that you cannot lose A no matter what. So, your +A vs -A cannot be appropriate. You either win A or 2A.
Little Alien

7

My interpretation of the question

I am assuming that the setting in problem 3 is as follows: the organizer first selects amount X and puts X in the first envelope. Then, the organizer flips a fair coin and based on that puts either 0.5X or 2X to the second envelope. The player knows all this, but not X nor the result of the coin-flip. The organizer gives the player the first envelope (closed) and asks if the player wants to switch. The questioner argues 1. that the player wants to switch because the switching increases expectation (correct) and 2. that after switching, the same reasoning symmetrically holds and the player wants to switch back (incorrect). I also assume the player is a rational risk-neutral Bayesian agent that puts a probability distribution over X and maximizes expected amount of money earned.

Note that if the player did not know about the coin-flip procedure, there might be no reason in the first place to argue that the probabilities are 0.5 for the second envelope being higher/lower.

Why there is no paradox

Your problem 3 (as interpreted in my answer) is not the envelope paradox. Let Z be a Bernoulli random variable with P(Z=1) = 0.5. Define the amount Y in the 2nd envelope so that Z = 1 implies Y = 2X and Z = 0 implies Y = 0.5X. In the scenario here, X is selected without knowledge of the result of the coin-flip and thus Z and X are independent, which implies E(Y ∣ X) = 1.25X.

E(Y) = E(E(Y ∣ X)) = E(1.25X) = 1.25·E(X)
Thus, if X > 0 (or at least E(X) > 0), the player will prefer to switch to envelope 2. However, there is nothing paradoxical about the fact that if you offer me a good deal (envelope 1) and an opportunity to switch to a better deal (envelope 2), I will want to switch to the better deal.

To invoke the paradox, you would have to make the situation symmetric, so that you could argue that I also want to switch from envelope 2 to envelope 1. Only this would be the paradox: that I would want to keep switching forever. In the question, you argue that the situation indeed is symmetric, however, there is no justification provided. The situation is not symmetric: the second envelope contains the amount that was picked as a function of a coin-flip and the amount in the first envelope, while the amount in the first envelope was not picked as a function of a coin-flip and the amount in the second envelope. Hence, the argument for switching back from the second envelope is not valid.

Example with small number of possibilities

Let us assume that (the player's belief is that) X=10 or X=40 with equal probabilities, and work out the computations case by case. In this case, the possibilities for (X,Y) are {(10,5),(10,20),(40,20),(40,80)}, each of which has probability 1/4. First, we look at the player's reasoning when holding the first envelope.

  1. If my envelope contains 10, the second envelope contains either 5 or 20 with equal probabilities, thus by switching I gain on average 0.5×(−5) + 0.5×10 = 2.5.
  2. If my envelope contains 40, the second envelope contains either 20 or 80 with equal probabilities, thus by switching I gain on average 0.5×(−20) + 0.5×40 = 10.

Taking the average over these, the expected gain of switching is 0.5×2.5+0.5×10=6.25, so the player switches. Now, let us make similar case-by-case analysis of switching back:

  1. If my envelope contains 5, the old envelope with probability 1 contains 10, and I gain 5 by switching.
  2. If my envelope contains 20, the old envelope contains 10 or 40 with equal probabilities, and by switching I gain 0.5×(−10) + 0.5×20 = 5.
  3. If my envelope contains 80, the old envelope with probability 1 contains 40 and I lose 40 by switching.

Now, the expected value, i.e. probability-weighted average, of the gain from switching back is 0.25×5 + 0.5×5 + 0.25×(−40) = −6.25. So, switching back exactly cancels the expected utility gain.
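Since this example has only four equally likely (X, Y) pairs, both expected gains can be computed exactly by enumeration. A quick sketch of that computation:

```python
from fractions import Fraction

# The four equally likely (X, Y) pairs from the example above.
pairs = [(10, 5), (10, 20), (40, 20), (40, 80)]
p = Fraction(1, 4)

# Expected gain of switching from envelope 1 to envelope 2: E[Y - X].
gain_forward = sum(p * (y - x) for x, y in pairs)

# Expected gain of switching back: E[X - Y].
gain_back = sum(p * (x - y) for x, y in pairs)

print(gain_forward)  # 25/4, i.e. +6.25
print(gain_back)     # -25/4, i.e. -6.25
```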

Another example with a continuum of possibilities

You might object to my previous example by claiming that I maybe cleverly selected the distribution over X so that in the Y=80 case the player knows that he is losing. Let us now consider a case where X has a continuous unbounded distribution: X ~ Exp(1), Z independent of X as previously, and Y a function of X and Z as previously. The expected gain of switching from X to Y is again E(0.25X) = 0.25·E(X) = 0.25. For the back-switch, we first compute the conditional probability P(X = 0.5Y ∣ Y = y) using Bayes' theorem:

P(X = 0.5Y ∣ Y = y) = P(Z = 1 ∣ Y = y) = p(Y = y ∣ Z = 1)·P(Z = 1)/p(Y = y) = p(2X = y)·P(Z = 1)/p(Y = y) = 0.25·e^(−0.5y)/p(Y = y),
and similarly P(X = 2Y ∣ Y = y) = e^(−2y)/p(Y = y), wherefore the conditional expected gain of switching back to the first envelope is
E(X − Y ∣ Y = y) = (−0.125·y·e^(−0.5y) + y·e^(−2y))/p(Y = y),
and taking the expectation over Y, this becomes
E(X − Y) = ∫₀^∞ [(−0.125·y·e^(−0.5y) + y·e^(−2y))/p(Y = y)]·p(Y = y) dy = −0.25,
which cancels out the expected gain of the first switch.
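This cancellation can also be checked by Monte Carlo. A small sketch of the X ~ Exp(1) setup (the function name is mine):

```python
import random

def simulate_exp_example(n=400_000, seed=4):
    """X ~ Exp(1); a fair coin sets Y = 2X or Y = X/2. Returns Monte Carlo
    estimates of E[Y - X] (first switch) and E[X - Y] (switching back)."""
    rng = random.Random(seed)
    forward = back = 0.0
    for _ in range(n):
        x = rng.expovariate(1.0)                     # amount in envelope 1
        y = 2 * x if rng.random() < 0.5 else 0.5 * x # amount in envelope 2
        forward += y - x
        back += x - y
    return forward / n, back / n

# The two estimates hover near +0.25 and -0.25 respectively.
print(simulate_exp_example())
```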

General solution

The situation seen in the two examples must always occur: you cannot construct a probability distribution for X, Z, Y with these conditions: X is not a.s. 0, Z is Bernoulli with P(Z=1) = 0.5, Z is independent of X, Y = 2X when Z = 1 and 0.5X otherwise, and also Y, Z are independent. This is explained in the Wikipedia article under the heading 'Proposed resolutions to the alternative interpretation': such a condition would imply that the probability that the smaller envelope holds an amount between 2^n and 2^(n+1) (P(2^n ≤ min(X, Y) < 2^(n+1)) in my notation) would be a constant over all natural numbers n, which is impossible for a proper probability distribution.

Note that there is another version of the paradox where the probabilities need not be 0.5, but the expectation of other envelope conditional on the amount in this envelope is still always higher. Probability distributions satisfying this type of condition exist (e.g., let the amounts in the envelopes be independent half-Cauchy), but as the Wikipedia article explains, they require infinite mean. I think this part is rather unrelated to your question, but for completeness wanted to mention this.


I edited my question trying to explain why I think it is similar to the envelope paradox and you would want to switch forever.
evan54

@evan54 I rewrote my answer to contain my interpretation of the setting problem 3, more explanation about why the situation is not symmetric, examples etc.
Juho Kokkala

I think I'm close to getting it. I think that once there is a coin flip and envelope 2 contains half/double the amount in your hand you are basically in the situation of the envelope paradox BUT the way you got there guarantees you that you are better off switching. Does that make sense?
evan54

also, if it does, is there a way to make it more formal? I may ponder on it more..
evan54

1
@evan54 Not sure. The whole point of the paradox is that it is a situation in which there is no advantage to switching. Thus, anything you change to the setup of the problem that results in it being advantageous to switch, at least initially, must therefore not be equivalent to the setup of the two envelope paradox. Note that in your setup, it only makes sense to switch the very first time. After you switch the first time, you expect to lose by switching back. The flawed logic in the paradox comes into play if you attempt to argue that you should switch back.
jsk

4

Problem 1: Agreed, play the game. The key here is that you know the actual probabilities of winning 5 vs 20 since the outcome is dependent upon the flip of a fair coin.

Problem 2: The problem is the same as problem 1 because you are told that there is an equal probability that either 5 or 20 is in the other envelope.

Problem 3: The difference in problem 3 is that telling me the other envelope has either X/2 or 2X in it does not mean that I should assume that the two possibilities are equally likely for all possible values of X. Doing so implies an improper prior on the possible values of X. See the Bayesian resolution to the paradox.


I see we interpret problem 3 slightly differently. I assumed OP specifically constructs the setting in problem 3 so that the 2nd envelope has probabilities 0.5/0.5. This is clearly possible without improper distributions, but then the possibilities for envelope 1 are not equally likely given the amount in the second envelope.
Juho Kokkala

Agreed, if OP meant that you are told that the other envelope either has X/2 or 2X with equal probabilities, then problem 3 would not be equivalent to the 2 envelope paradox.
jsk

Yes, that was my thinking: in problem 3 there is equal probability between X/2 and 2X. So you hold 3 envelopes, give him the 10, and then flip a coin to see whether you give him the 20 or the 5 (they are closed) if he decides to switch.
evan54

1
@evan54 - if you make the random flip after you choose which envelope to give me, then it's equivalent to problem 1; if you choose both amounts of money, and then make a random flip on which envelope you give me, then it's the situation described above; they're different situations.
Peteris

1
@evan54 - the optimal player's decision depends on how you made those envelopes. If you don't tell the player how you did that (only that 50/50 sentence), then the optimal strategy depends on player's assumptions on how likely you are to do it one way or another - the first envelope you prepared is less valuable than the second envelope you prepared; if they were fairly shuffled (and unopened) then it doesn't matter what the player chooses; if the player thinks that you likely (>50%) initially gave him the first envelope, then player should switch and stick with that.
Peteris

1

This is a potential explanation that I have. I think it is wrong but I'm not sure. I will post it to be voted on and commented on. Hopefully someone will offer a better explanation.

So the only thing that changed between problem 2 and problem 3 is that the amount in the envelope you hold became random. If you allow that amount to be negative, so that there might be a bill there instead of money, then it makes perfect sense. The extra information you get when you open the envelope is whether it holds a bill or money, hence you care to switch in one case while in the other you don't.

If however you are told the bill is not a possibility, then the problem remains. (Of course, do you assign a probability that they are lying?)


Introducing the possibility of negative amounts is an interesting observation, but not needed for resolving the issue in your question. See my answer.
Juho Kokkala

It is not necessary to assume the amount in the envelope is random: it suffices that it is unknown. Assuming randomness adduces information--however little it might be--that was not given in the problem!
whuber

1
The biggest difference between 2 and 3 is that being told the other amount is either X/2 or 2X is not the same as being told that the two possibilities are equally likely. Assuming the two amounts are equally likely is not the same as being told the two amounts are equally likely.
jsk

1

Problem 2A: 100 note cards are in an opaque jar. "$10" is written on one side of each card; the opposite side has either "$5" or "$20" written on it. You get to pick a card and look at one side only. You then get to choose one side (the revealed, or the hidden), and you win the amount on that side.

If you see "$5," you know you should choose the hidden side and will win $10. If you see "$20," you know you should choose the revealed side and will win $20. But if you see "$10," I have not given you enough information to calculate an expectation for the hidden side. Had I said there were an equal number of {$5,$10} cards as {$10,$20} cards, the expectation would be $12.50. But you can't find the expectation from only the fact - which is still true - that you had equal chances to reveal the higher, or lower, value on the card. You need to know how many of each kind of card there were.

Problem 3A: The same jar is used, but this time the cards all have different, and unknown, values written on them. The only thing that is the same, is that on each card one side is twice the value of the other.

Pick a card, and a side, but don't look at it. There is a 50% chance that it is the higher side, and a 50% chance that it is the lower side. One possible analysis is that the card is either {X/2, X} or {X, 2X} with 50% probability, where X is your side. But we saw above that the probability of choosing high or low is not the same thing as these two different cards being equally likely to be in the jar.

What changed between your Problem 2 and Problem 3, is that you made these two probabilities the same in Problem 2 by saying "This envelope either has $5 or $20 in it with equal probability." With unknown values, that can't be true in Problem 3.


0

Overview

I believe that the way you have broken out the problem is completely correct. You need to distinguish the "coin flip" scenario from the situation where the money is added to the envelopes before an envelope is chosen.

Not distinguishing those scenarios lies at the root of many people's confusion.

Problem 1

If you are flipping a coin to decide whether you double your money or lose half of it, always play the game. Instead of double or nothing, it is double or lose only half.

Problem 2

This is exactly the same as the coin flip scenario. The only difference is that the person offering the envelopes flipped the coin before giving you the first envelope. Note: You Did Not Choose an Envelope! You were given one envelope, and then given the choice to switch. This is a subtle but important difference from problem 3, which affects the distribution of the priors.

Problem 3

This is the classical setup to the two envelope problem. Here you are given the choice between the two envelopes. The most important points to realize are

  • There is a maximum amount of money that can be in any envelope, because the person running the game has finite resources, or a finite amount they are willing to invest.
  • If you call the maximum money that could be in an envelope M, you are not equally likely to see any number between 0 and M. Assume a random amount of money between 0 and M was put in the first envelope, and half of that in the second (or double; the math still works). If you open an envelope, you are 3 times as likely to see something less than M/2 than above M/2. (This is because half the time both envelopes will hold less than M/2, and the other half of the time one envelope will.)
  • Since there is not an even distribution, the 50% of the time you double, 50% of the time you cut in half doesn't apply
  • When you work out the actual probabilities, you find the expected value of the first envelope is M/2, and the EV of the second envelope, switching or not is also M/2

Interestingly, if you can make some guess as to what the maximum money in an envelope can be, or if you can play the game multiple times, then you can benefit by switching whenever you open an envelope holding less than M/2. I have simulated this two envelope problem and find that with this outside information, on average you can do 1.25 times as well as always switching or never switching.
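That simulation is easy to reproduce. A sketch under one concrete reading of the setup above (the smaller amount uniform on (0, M/2), so both amounts stay below M; names are mine):

```python
import random

def play(strategy, m=100.0, n=400_000, seed=5):
    """Envelopes hold (a, 2a) with a uniform on (0, m/2); open one at random,
    then apply the strategy (a predicate on the observed amount) to decide
    whether to switch. Returns the average reward."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        a = rng.uniform(0, m / 2)
        x, other = (a, 2 * a) if rng.random() < 0.5 else (2 * a, a)
        total += other if strategy(x) else x
    return total / n

threshold = lambda x, m=100.0: x < m / 2   # switch only when the amount is small
always = lambda x: True                    # always switch

# Threshold switching averages 15m/32 versus 3m/8 for always switching,
# a ratio of exactly 1.25.
print(play(threshold), play(always))
```

Under this reading the 1.25 factor can also be derived in closed form: the threshold rule earns 15M/32 on average against 3M/8 for any constant strategy, and 15/12 = 1.25.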
