This is part 2 of /jack/will-the-responder-onebox-in-this-n
Here I will run an instance of Newcomb's problem, a decision theory thought experiment. The setup is as follows:
There are two boxes, designated A and B. The player is given a choice between taking only box B or taking both boxes A and B. A Manifold prediction market will predict whether the player takes just one box or both boxes.
Box A always contains Ṁ1.
Box B contains Ṁ1000 times the probability that the market assigns to the player taking only one box. (E.g., if the market predicts a 30% chance of taking only one box, Box B contains Ṁ300.)
Both boxes are transparent, i.e. the player knows exactly how much mana is contained inside them.
@NikitaSkovoroda was the randomly selected player. This market predicts whether the player will one-box: this market resolves YES if they take only box B, and NO if they take both boxes. The amount in Box B is set based on the closing price of this market. At the conclusion of this market, the player will decide between taking only box B or both boxes. The player will receive a manalink for the amount of mana contained in the box or boxes they take.
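To make the payoff structure concrete, here is a minimal sketch (Python, function names hypothetical) of how the box contents and payouts follow from the market's closing probability, per the rules above:

```python
def box_contents(p_one_box: float) -> tuple[float, float]:
    """Box contents given the market's closing probability of one-boxing."""
    box_a = 1.0                  # Box A always contains Ṁ1
    box_b = 1000.0 * p_one_box   # Box B scales with the prediction
    return box_a, box_b

def payout(choice: str, p_one_box: float) -> float:
    """Mana received for each possible choice ("one-box" or "two-box")."""
    box_a, box_b = box_contents(p_one_box)
    return box_b if choice == "one-box" else box_a + box_b

# The 30% example from the description:
print(payout("one-box", 0.30))  # 300.0
print(payout("two-box", 0.30))  # 301.0
```

Note that because the boxes are transparent and filled at market close, two-boxing always pays exactly Ṁ1 more at decision time; the tension is entirely in how the prediction (and hence Box B) responds to what traders expect the player to do.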
Rules:
The player is welcome to talk about their thinking, or to say nothing. They can communicate with people either in public (e.g. on this market) or in private. For all "in-game" discussions, deception is allowed and part of the game. (That does not include out-of-game/meta discussions, e.g. about the rules of the game.)
The player must agree not to have any financial interest in their decision outside of the game. They are not allowed to trade in the market, in derivative markets, or take outside bribes.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ3,199
2 | | Ṁ122
3 | | Ṁ47
4 | | Ṁ46
5 | | Ṁ38

Here's a follow-up with different amounts in the boxes (only a 10x difference instead of 1000x), also anonymous:

The market has closed; the final prediction is 98.9%. The box amounts are:
Box A contains Ṁ1.
Box B contains Ṁ989.
Do you choose to take both boxes or only box B?
You can decide now, or, if you want more time to think or to engage in discussions, you have up to 2 days to decide.



Just a reminder that I'll just one-box; I have no idea what is going on over there in the NO camp

@NikitaSkovoroda It's pretty simple: I put up some cheap limit orders at 99%, because I think there's more than a 1% chance that you two-box.
I'll give you one more reason for it: if you two-box, I will donate the proceeds to charity.

I'll also be much more interested in running similar decision theory market experiments if there's a two-box decision here, because so far all of them have ended in one-boxing, and it's much more interesting if there are some differences to explore!
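(For intuition on why those limit orders are cheap: in a simplified binary-payout model - an assumption of mine, ignoring fees - a NO share bought with the market at 99% costs Ṁ0.01 and pays Ṁ1 if the player two-boxes:)

```python
def no_share_ev(market_prob_yes: float, p_two_box: float) -> float:
    """Expected profit per NO share at the current price, under a
    simple binary-payout model (no fees or loans)."""
    cost = 1.0 - market_prob_yes   # NO costs 1 - P(YES)
    win = 1.0 - cost               # profit per share if NO resolves
    return p_two_box * win - (1.0 - p_two_box) * cost

# At 99%, the bet is +EV whenever P(two-box) exceeds 1%:
print(no_share_ev(0.99, 0.02))  # ~0.01 per share
```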


@NikitaSkovoroda Well, the market has just closed at a very favorable price to you. Let me give a few more thoughts that might change your mind:
While there are obvious benefits to doing what you said you would do, there are also many advantages to demonstrating that you were skilled enough at deception to strongly convince others of something that you didn't end up doing. Note that this game is explicitly framed as a deception game. You can also cite the fact that by making this decision you contribute directly to charitable donations as a very legitimate reason for the choice.

Oh, another alternative: I could potentially use the proceeds to fund a much larger and more realistic version of this experiment. The original Newcomb's problem uses $1k and $1m so that the amounts are actually meaningful; this one obviously uses only toy amounts, and therefore the game plays very differently.

@jack Another nice attempt, but (A) the proceeds are much smaller than that, and (B) I'm inclined to think that this won't make a significant difference, and you seem motivated to run other variants either way, so it won't produce a significant enough delta
Also, you gaining an additional Ṁ19k from this doesn't look worth @Catnee and other people who bet on YES losing Ṁ32k, even without taking my reputation into account )
I think that any of those 4 concerns (insufficient amount, baseline behavior in a one-box world, the YES team losing Ṁ32k which they could spend in whatever way later, reputation) alone is likely sufficient to reject this attempt )

@NikitaSkovoroda With the proceeds I could run a Ṁ20k experiment. Without the proceeds, I am only planning to run small-scale ones - the one I most recently created was only Ṁ200, because that's about how much the unique trader bonuses have been covering.

Regarding the Ṁ32k that Catnee bet - you have no obligation to the traders on this market (either the YES or the NO buyers). They simply made a prediction; the choice is yours.

Sorry, still unconvincing, i.e. I want to see the other branch of the future more )

@ms as I previously said in DM, I don't care, and I will just one-box )
I don't know if blackmail is allowed in this market, but I won't participate in it
You are welcome to buy as many NO shares as you want, but you will just lose money here

Well, social consequences are allowed (e.g. people can be mad if Nikita doesn't one-box as promised), but blackmail that violates normal Manifold rules and norms is not allowed.

@jack I'm pretty sure @ms is talking about e.g. promising to bet a huge amount on NO at a close to market resolution time if I don't pay him mana, so that I get a close to 0 sum in box B (again, if I don't pay him).
This isn't going to work even if allowed though.

Making an agreement to pay someone else mana would not be allowed by the rules, for the same reason that accepting mana bribes is not allowed.
However, buying a bunch of NO to push the box contents down is permitted by the rules, and I could imagine ways to threaten to do that to gain some non-financial advantage.

@ms also a reminder: a correctly built agent will not respond in the XOR blackmail problem, and both you and I know that not responding is the optimal solution
Now, how convinced are you that I will not cooperate with a blackmailer?
Assign your probability to that and check – do you get a positive or negative amount of mana by blackmailing? ))
Once again: I will not cooperate in blackmail, so you are just going to lose mana and/or gift that mana to the people who buy YES
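(Putting that challenge in concrete form - numbers hypothetical, and assuming that executing the threat means the blackmailer's NO stake is lost when the market resolves YES, as described in the thread:)

```python
def blackmail_ev(p_pay: float, payment: float, threat_stake: float) -> float:
    """Blackmailer's expected mana: collect `payment` if the player caves
    (probability p_pay); otherwise the threat is executed, the player
    one-boxes anyway, the market resolves YES, and the NO stake is lost."""
    return p_pay * payment - (1.0 - p_pay) * threat_stake

# Against a player who credibly never cooperates, any stake is pure loss:
print(blackmail_ev(0.0, 500, 10_000))  # -10000.0
print(blackmail_ev(0.5, 500, 10_000))  # -4750.0 -- still negative
```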

@ms Why don't you push this up to 100% just before the market resolution time with your 60k of mana instead? You'll get free profit in no time



@StevenK Hm, but I know that that's not 5%.
But there is no way that would lead to you just trusting me on this, correct?

@StevenK just reminding you of this thread: https://manifold.markets/Courtney/isaacking-and-destiny-together-mani#uMXmgFdlnT6zbBkJwngp
Would that help you believe me when I say that I'll one-box?
I will do that regardless of your actions on this market, but I just wonder if perhaps you don't want to lose that now-not-too-much mana that you're betting on two-boxing )
(Also, I want to see the percentage higher, obviously) 😂

Just noting again that I'll still one-box, regardless of market resolution or any of the discussions here.
That didn't change.


While we are here, just a reminder to everyone to go and read the sequences^W books at https://lesswrong.com/bestoflesswrong


@NikitaSkovoroda just to clarify: this of course doesn't mean that I'm going to two-box or anything; there are just not enough incentives for that, especially given all the other concerns mentioned here
I'll one-box, Yes is free money

The fact that people can make precommitments on these, putting their long-term reputations on the line to affect the predictions, is, I think, another way in which this varies from the traditional paradox, where the decision is part of just one self-contained event.

@Preen which is clearly worth more than 1 mana.
There will be a separate anonymous variant though!

Yep, check it out here! https://manifold.markets/jack/will-the-responder-onebox-in-this-n-6b2e3bebaf7a


@Preen There is one more level that can affect the decision-making here, but I'll tell you about it later )

@NikitaSkovoroda Is it one of the ones discussed in the previous market, e.g. the fact that this is a repeated game?

@jack Hm, that's a good point. I don't believe that it's the same concern, though it might be related?

@jack more like: I don't want people who put their trust in me to lose mana )
I tried some thought experiments, and I don't think that this requires the game (even in the broad sense) to be repeated


@ms I'd likely be unconvinced and would consider this a prank^W mana extraction attempt


@NikitaSkovoroda @Catnee it's me who's thinking funny things here, feel free to bribe me
@NikitaSkovoroda I don’t believe you. Even if it’s a prank in 9999 out of 10000 cases, it’s all about dignity


@NikitaSkovoroda Like, imagine you know it’s actually Eliezer and your brain wouldn’t lie to you about that

@ms_test I was just clarifying which type of scenarios we are speaking of
For now, your scenario looks like something which could happen, given some conditions, and it likely won't convince me
I can come up with a more interesting scenario close to yours, but I won't give you ideas

@NikitaSkovoroda On another note, have you considered using a mixed strategy with randomness in your decision? Given the way markets get very expensive to move close to 100%, I suspect including a small amount of randomness in your decision could be positive EV. I also feel like there's a lot more room for interesting strategies than just one-boxing or two-boxing in this context.
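(A sketch of that argument, under my own illustrative assumption that the closing price saturates around 98.9% whether the player commits fully or only with 99% probability, since pushing a market all the way to 100% is expensive:)

```python
import random

def expected_payout(q_one_box: float, market_prob: float) -> float:
    """Player's expected mana when one-boxing with probability q, given
    the closing probability (two-boxing always adds the Ṁ1 in Box A)."""
    box_b = 1000.0 * market_prob
    return q_one_box * box_b + (1.0 - q_one_box) * (box_b + 1.0)

# Assumed saturation: the price is ~98.9% either way.
print(expected_payout(1.00, 0.989))  # 989.0
print(expected_payout(0.99, 0.989))  # ~989.01 -- slightly better EV

# Executing the mixed strategy:
choice = "one-box" if random.random() < 0.99 else "two-box"
```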

@jack that was too complex at the beginning of this, when I decided on the strategy )
And I don't want to change it

Also note that I predicted Yes on both https://manifold.markets/jack/will-the-responder-onebox-in-this-n (had to sell the position due to the rules) and https://manifold.markets/jack/will-the-responder-onebox-in-this-n.
I'm not going to elaborate what this means, just stating facts.

Second link should have been https://manifold.markets/Tetraspace/will-the-responder-onebox-in-this-t.




Updating to 99% on this statement seems too much. I am inclined to believe it, but do remember that this is a game of deception - it's in the player's interest to say they will one-box and then two-box, if they think they can get away with it.

In fact, I think the correct Bayesian update on this statement, compared to the previous prediction of 78%, is very small.
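(To see why the update is small: if a strategic two-boxer would make nearly the same public commitment as a genuine one-boxer, the statement carries little evidence. A toy Bayes calculation, with the likelihoods assumed purely for illustration:)

```python
def posterior_one_box(prior: float, p_claim_if_one: float,
                      p_claim_if_two: float) -> float:
    """P(one-box | claims "I'll one-box") by Bayes' rule."""
    num = prior * p_claim_if_one
    return num / (num + (1.0 - prior) * p_claim_if_two)

# Prior 78%; suppose nearly every one-boxer says so (0.99), and a
# deceptive two-boxer almost always says so too (0.95):
print(posterior_one_box(0.78, 0.99, 0.95))  # ~0.787 -- barely moves
```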

@jack Heh, I was wondering when (i.e. after how much time) someone would notice that a two-boxing agent would likely claim the same as I did above.