
This is part 2 of /jack/will-the-responder-onebox-in-this-n
Here I will run an instance of Newcomb's problem, a decision theory thought experiment. The setup is as follows:
There are two boxes designated A and B. The player is given a choice between taking only box B or taking both boxes A and B. There will be a Manifold prediction market which predicts whether the player takes just one box or both boxes.
Box A always contains M$1.
Box B contains M$1000 times the probability that the Manifold market assigns to the player taking only one box. (E.g. if the market predicts a 30% chance of taking only one box, Box B contains Ṁ300.)
Both boxes are transparent, i.e. the player knows exactly how much mana is contained inside them.
@NikitaSkovoroda was the randomly selected player. This market predicts whether the player will one-box: this market resolves YES if they take only box B, and NO if they take both boxes. The amount in Box B is set based on the closing price of this market. At the conclusion of this market, the player will decide between taking only box B or both boxes. The player will receive a manalink for the amount of mana contained in the box or boxes they take.
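To make the payoff structure concrete, here is a minimal Python sketch of the two options as a function of the market's closing probability (the function and variable names are mine, purely for illustration):

```python
def box_contents(p_one_box: float) -> tuple[float, float]:
    """Return (Box A, Box B) contents in mana, given the market's
    closing probability that the player one-boxes."""
    box_a = 1.0                  # Box A always contains M$1
    box_b = 1000.0 * p_one_box   # Box B scales with the predicted probability
    return box_a, box_b

def payoff(p_one_box: float, one_box: bool) -> float:
    """Mana the player walks away with for a given choice."""
    box_a, box_b = box_contents(p_one_box)
    return box_b if one_box else box_a + box_b

# Example from the description: a 30% closing price puts M$300 in Box B.
print(box_contents(0.30))            # (1.0, 300.0)
print(payoff(0.30, one_box=True))    # 300.0
print(payoff(0.30, one_box=False))   # 301.0
```

Because the closing price is locked in before the choice is made, two-boxing always pays exactly M$1 more at decision time; the interesting question is how the player's anticipated decision feeds back into that closing price.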
Rules:
The player is welcome to talk about their thinking, or to say nothing. They can communicate with people either in public (e.g. on this market) or in private. For all "in-game" discussions, deception is allowed and part of the game. (That does not include out-of-game/meta discussions, e.g. about the rules of the game.)
The player must agree not to have any financial interest in their decision outside of the game. They are not allowed to trade in the market, in derivative markets, or take outside bribes.
@NikitaSkovoroda It's pretty simple: I put up some cheap limit orders at 99%, because I think there's more than a 1% chance that you two-box.
I'll give you one more reason for it: if you two-box, I will donate the proceeds to charity.
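For context on the bet itself: the market price is the probability of YES (one-boxing), so a NO limit order at 99% fills at roughly M$0.01 per share, with each share paying M$1 if the market resolves NO. A rough expected-value sketch, ignoring fees and treating shares as simple binary payouts (numbers are illustrative):

```python
def no_share_ev(limit_price_yes: float, p_two_box: float) -> float:
    """Approximate expected profit per NO share bought at a YES price of
    `limit_price_yes`, if the true probability of two-boxing is `p_two_box`."""
    cost = 1.0 - limit_price_yes   # a NO share costs (1 - YES price)
    return p_two_box * 1.0 - cost  # pays M$1 if NO, M$0 if YES

print(no_share_ev(0.99, p_two_box=0.01))  # ~0, break-even at exactly 1%
print(no_share_ev(0.99, p_two_box=0.05))  # ~0.04 profit per share
```

So the NO bet has positive expected value precisely when the chance of two-boxing exceeds 1%, which is the reasoning stated above.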
@NikitaSkovoroda Well, the market has just closed at a very favorable price to you. Let me give a few more thoughts that might change your mind:
While there are obvious benefits to doing what you said you would do, there are also many advantages to demonstrating that you were skilled enough at deception to strongly convince others of something that you didn't end up doing. Note that this game is explicitly framed as a deception game. You can also cite the fact that by making this decision you contribute directly to charitable donations as a very legitimate reason for the choice.
Oh, another alternative: I could potentially use the proceeds to fund a much larger and more realistic version of this experiment. The original Newcomb's problem uses $1k and $1m so that the amounts are actually meaningful; obviously this one is only using toy amounts, and therefore the game plays very differently.
@jack Another nice attempt, but (A) the proceeds are much smaller than that, and (B) I'm inclined to think this won't make a significant difference, and you seem motivated to run other variants either way, so it won't result in a significant enough delta
Also, you gaining an additional M$19k from this doesn't look worth @Catnee and other people who bet on Yes losing M$32k, even without taking my reputation into account )
I think that any one of those 4 concerns (insufficient amount, baseline behavior in a one-box world, the Yes team losing M$32k which they could otherwise spend however they like later, reputation) alone is likely sufficient to reject this attempt )
@NikitaSkovoroda With the proceeds I could run a M$20k experiment. Without the proceeds, I am only planning to run small-scale ones - the one I most recently created was only M$200 because that's about how much the unique traders bonuses have been covering.
@ms as I previously said in dm, I don't care, and I will just one-box )
I don't know if blackmail is allowed in this market, but I won't participate in it
You are welcome to buy as many NO shares as you want, but you will just lose money here
@jack I'm pretty sure @ms is talking about e.g. promising to bet a huge amount on NO at a close to market resolution time if I don't pay him mana, so that I get a close to 0 sum in box B (again, if I don't pay him).
This isn't going to work even if allowed though.
Making an agreement to pay someone else mana would not be allowed by the rules, for the same reason that accepting mana bribes is not allowed.
However, buying a bunch of NO to push the box contents down is permitted by the rules, and I could imagine ways to threaten to do that to gain some non-financial advantage.
@ms also a reminder: a correctly built agent will not respond in the XOR blackmail problem, and both you and I know that not responding is the optimal solution
Now, how convinced are you that I will not cooperate with a blackmailer?
Assign your probability to that and check – do you get a positive or negative amount of mana by blackmailing? ))
Once again: I will not cooperate in blackmail, so you are just going to lose mana and/or gift that mana to people who buy Yes
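The calculation being invited here, as a rough sketch: the blackmailer profits only if the chance of the player paying, times the payment, outweighs the mana lost by actually dumping NO shares on a player who goes on to one-box (those shares resolve worthless when the market resolves YES). All numbers below are hypothetical:

```python
def blackmail_ev(p_player_pays: float, demanded_payment: float,
                 threat_cost_if_refused: float) -> float:
    """Rough expected value for the blackmailer.

    p_player_pays:          chance the player gives in and pays
    demanded_payment:       mana extracted if the player pays
    threat_cost_if_refused: mana lost by actually executing the threat
                            (NO shares bought to crash the price resolve
                            worthless if the player one-boxes anyway)
    """
    return (p_player_pays * demanded_payment
            - (1.0 - p_player_pays) * threat_cost_if_refused)

# A player who credibly refuses to ever pay makes the threat pure loss.
print(blackmail_ev(p_player_pays=0.0, demanded_payment=500,
                   threat_cost_if_refused=2000))   # -2000.0
# Even a 50% chance of paying doesn't cover the cost of executing the threat here.
print(blackmail_ev(p_player_pays=0.5, demanded_payment=500,
                   threat_cost_if_refused=2000))   # -750.0
```

This is exactly why precommitting not to respond, and being believed, defuses the threat.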
@ms Why don't you up this to 100% just before the market resolution time with your 60k of mana instead? You'll get free profit in no time
@StevenK Hm, but I know that that's not 5%.
But there is no way that would lead to you just trusting me on this, correct?
@StevenK just reminding you of this thread: https://manifold.markets/Courtney/isaacking-and-destiny-together-mani#uMXmgFdlnT6zbBkJwngp
Would that help you believe me when I say that I'll one-box?
I will do that regardless of your actions on this market, but I just wonder if perhaps you don't want to lose that now-not-too-much mana that you're betting on two-boxing )
(Also, I want to see the percentage higher, obviously) 😂