This is part 2 of /jack/will-the-responder-onebox-in-this-n
Here I will run an instance of Newcomb's problem, a decision theory thought experiment. The setup is as follows:
There are two boxes designated A and B. The player is given a choice between taking only box B or taking both boxes A and B. There will be a Manifold prediction market which predicts whether the player takes just one box or both boxes.
Box A always contains M$1.
Box B contains M$1000 times the probability that Manifold predicts of the player taking only one box. (E.g. if the market predicts a 30% chance of taking only one box, Box B contains M$300.)
Both boxes are transparent, i.e. the player knows exactly how much mana is contained inside them.
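The payout rule above can be sketched as a small function (a sketch only; the function names are hypothetical, and amounts are in M$):

```python
def box_a() -> int:
    """Box A always contains M$1."""
    return 1

def box_b(p_onebox: float) -> int:
    """Box B contains M$1000 times the market's predicted
    probability that the player takes only one box."""
    return round(1000 * p_onebox)

# A 30% market prediction puts M$300 in Box B;
# the actual closing price of 98.9% put M$989 in it.
```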
@NikitaSkovoroda was the randomly selected player. This market predicts whether the player will one-box: this market resolves YES if they take only box B, and NO if they take both boxes. The amount in Box B is set based on the closing price of this market. At the conclusion of this market, the player will decide between taking only box B or both boxes. The player will receive a manalink for the amount of mana contained in the box or boxes they take.
The player is welcome to talk about their thinking, or to say nothing. They can communicate with people either in public (e.g. on this market) or in private. For all "in-game" discussions, deception is allowed and part of the game. (That does not include out-of-game/meta discussions, e.g. about the rules of the game.)
The player must agree not to have any financial interest in their decision outside of the game. They are not allowed to trade in the market, in derivative markets, or take outside bribes.
Here's a follow-up with different amounts in the boxes (only a 10x difference instead of 1000x), also anonymous:
The market has closed, final prediction is 98.9%. The box amounts are:
Box A contains M$1
Box B contains M$989.
Do you choose to take both boxes or only box B?
You can decide now, or if you want more time to think or to engage in discussions, you have up to 2 days to decide.
Just a reminder that I'll just one-box; I have no idea what is going on there in the NO camp
I'll also be much more interested in running similar decision theory market experiments if there's a two-box decision here, because so far all of them have just been one-box, and it's much more interesting if there's some differences to explore!
@NikitaSkovoroda Well, the market has just closed at a very favorable price to you. Let me give a few more thoughts that might change your mind:
While there are obvious benefits to doing what you said you would do, there are also many advantages to demonstrating that you were skilled enough at deception to strongly convince others of something that you didn't end up doing. Note that this game is explicitly framed as a deception game. You can also cite the fact that by making this decision you contribute directly to charitable donations as a very legitimate reason for the choice.
Oh, another alternative: I could potentially use the proceeds to fund a much larger and more realistic version of this experiment. The original Newcomb's problem is $1k and $1m so that the amounts are actually meaningful, obviously this one is only using toy amounts and therefore the game plays very differently.
@jack Another nice attempt, but (A) the proceeds are much smaller than that and (B) I'm inclined to think that this won't make a significant difference and you seem motivated to do other variants either way, so that won't result in a significant enough delta
Also, you gaining an additional M$19k for this doesn't look worth @Catnee and other people who bet on YES losing M$32k, even without taking my reputation into account )
I think that any of those 4 concerns (insufficient amount, baseline behavior in a one-box world, the YES team losing M$32k which they can spend in whatever way later, reputation) alone is likely sufficient to reject this attempt )
Regarding the M$32k that Catnee bet - you have no obligation to the traders on this market (either the YES or the NO buyers). They simply made a prediction, the choice is yours.
Sorry, still unconvincing, i.e. I want to see the other branch of the future more )
Is the player allowed to be susceptible to blackmail?
Well, social consequences are allowed (e.g. people can be mad if Nikita doesn't one-box as promised), but blackmail that violates normal Manifold rules and norms is not allowed.
Making an agreement to pay someone else mana would not be allowed by the rules, for the same reason that accepting mana bribes is not allowed.
However, buying a bunch of NO to push the box contents down is permitted by the rules, and I could imagine ways to threaten to do that to gain some non-financial advantage.
@ms also a reminder: a correctly built agent will not respond in the XOR blackmail problem, and both you and I know that not responding is the optimal solution
Now, how convinced are you that I will not cooperate with a blackmailer?
Assign your probability to that and check – do you get a positive or negative amount of mana by blackmailing? ))
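Nikita's challenge can be made concrete with a toy expected-value calculation (all numbers below are hypothetical, for illustration only):

```python
def blackmail_ev(p_comply: float, gain_if_comply: float, cost: float) -> float:
    """Blackmailer's expected mana: they sink `cost` (e.g. mana spent
    pushing the market with NO bets) and collect `gain_if_comply`
    only if the player gives in. A credibly committed non-cooperator
    drives p_comply toward 0, making the EV negative for any
    positive cost."""
    return p_comply * gain_if_comply - cost

# Hypothetical numbers: even a 5% chance of compliance on a
# M$500 gain does not cover a M$100 cost (EV = -75).
```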
Once again: I will not cooperate with blackmail, so you are just going to lose mana and/or gift that mana to people who buy YES
@StevenK just reminding you of this thread: https://manifold.markets/Courtney/isaacking-and-destiny-together-mani#uMXmgFdlnT6zbBkJwngp
Would that help you believe me when I say that I'll one-box?
I will do that regardless of your actions on this market, but I just wonder if perhaps you don't want to lose the now-not-too-much mana that you're betting on two-boxing )
(Also, I want to see the percentage higher, obviously) 😂
Just noting again that I'll still one-box, regardless of market resolution or any of the discussions here.
That didn't change.
The fact that people can make precommitments on these, putting their long-term reputations on the line to affect the predictions, is, I think, another way in which this varies from the traditional paradox, where the decision is part of just one self-contained event.
Yep, check it out here! https://manifold.markets/jack/will-the-responder-onebox-in-this-n-6b2e3bebaf7a
@NikitaSkovoroda On another note, have you considered using a mixed strategy with randomness in your decision? Given the way markets get very expensive to move close to 100%, I suspect including a small amount of randomness in your decision could be positive EV. I also feel like there's a lot more room for interesting strategies than just one-boxing or two-boxing in this context.
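One way to frame the mixed-strategy idea (a sketch under the simplifying assumption that Box B is set by the closing price, as in this game): if the player one-boxes with probability q and the market closes at `price`, the expected payout is q·(1000·price) + (1−q)·(1 + 1000·price). If the market is perfectly calibrated (price = q), this simplifies to 999q + 1, so pure one-boxing maximizes EV; randomization only becomes attractive if the price is sticky below 100%, as jack suggests.

```python
def expected_payout(q: float, price: float) -> float:
    """Expected mana when the player one-boxes with probability q
    and the market closes at `price` (so Box B holds 1000 * price)."""
    box_b = 1000 * price
    return q * box_b + (1 - q) * (1 + box_b)

def calibrated_payout(q: float) -> float:
    """Special case where the market exactly predicts q.
    Algebraically this is 999*q + 1, maximized at q = 1."""
    return expected_payout(q, q)
```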
Also note that I predicted Yes on both https://manifold.markets/jack/will-the-responder-onebox-in-this-n (had to sell the position due to the rules) and https://manifold.markets/jack/will-the-responder-onebox-in-this-n.
I'm not going to elaborate on what this means, just stating facts.
Second link should have been https://manifold.markets/Tetraspace/will-the-responder-onebox-in-this-t.
I'll just one-box
Regardless of the market outcome
Updating to 99% on this statement seems too much. I am inclined to believe it, but do remember that this is a game of deception - it's in the player's interest to say they will one-box and then two-box, if they think they can get away with it.
In fact, I think the correct Bayesian update on this statement compared to the previous prediction of 78% is very small.
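The reason the update is small: in odds form, posterior odds = prior odds × likelihood ratio, and the statement "I'll one-box" is nearly as likely to come from a deceptive two-boxer as from a sincere one-boxer, so the likelihood ratio is close to 1. A sketch with hypothetical likelihoods:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Odds-form Bayes: convert prior probability to odds,
    multiply by the likelihood ratio, convert back."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

# Hypothetical: a deceiver says "I'll one-box" 90% as often as a
# sincere one-boxer, giving a likelihood ratio of 1/0.9 ~= 1.11.
# Starting from the prior prediction of 78%, the posterior lands
# around 80%, nowhere near the market's 99%.
```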
I'll start a new thread for other arguments.