What is your degree of fear of being devoured by the Beast whose qualities are unspeakable?
29%: Devouring still quite likely, moderate terror
56%: Devouring now seems very unlikely, minimal fear of beast
16%: Other

The Doctor

You, a perfectly rational agent, awake in a cell. You take stock of your surroundings and find them bare, except for a small pamphlet next to you, entitled "Food for Thought". You open it and begin to read; it informs you that you have come to be as part of an insidious experiment run by "The Doctor", a rogue logician. The Doctor, whose motives you find impenetrable, will create a series of groups of people in turn, starting with one person and doubling in size each time. Each group's fate, once created, will be determined by a complex mechanism tucked away deep in a mountain, whose operation is unfathomable to you. However, the pamphlet assures you that the process is truly fair and random; no group will be favored over any other. The mechanism will output either LIFE or DOOM. The experiment ends once DOOM is output. Your fate, should the mechanism decide DOOM, will be to be devoured by the Beast, whose description does not bear repeating. The Doctor reasons that since the group which receives the judgement of DOOM is always larger than the sum of all members of the groups that are spared, the Beast will not become enraged by a larger, possibly more delicious cohort who remain un-devoured. You look around your cell; neither you nor it has any distinguishing features whatsoever. You consider that you are a perfectly rational agent, and try to decide exactly how worried you should be about your possible beastly consumption. After you fix your answer in your mind, a booming disembodied voice tells you that the mechanism has judged twelve trillion, four hundred and sixty-three billion, nine hundred and seven million, three hundred and four thousand, and five times before your upcoming judgement. Does your fear diminish?
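
One way to see the tension in the Doctor's setup: suppose, purely for illustration (the pamphlet never states this), that the mechanism outputs DOOM with some fixed per-round probability p. The sketch below pools head-counts over many simulated experiments; the function names and parameters are just for this sketch. As the Doctor himself observes, the doomed group always outnumbers everyone spared before it, so whatever p is, more than half of all people ever created end up devoured, even though each group's per-round risk is only p.

```python
import random

# A rough sketch, under the ASSUMED (not stated in the pamphlet) model that
# the mechanism outputs DOOM with a fixed per-round probability p.
# Group k contains 2**(k-1) people; only the group present when DOOM is
# output is devoured, and the experiment then ends.

def run_experiment(p, rng):
    """Run one experiment; return (devoured, spared) head-counts."""
    group_size = 1
    spared = 0
    while rng.random() >= p:   # LIFE: this group is spared
        spared += group_size
        group_size *= 2        # the Doctor doubles the next group
    return group_size, spared  # DOOM: the current group is devoured

def fraction_devoured(p, trials=100_000, seed=0):
    """Fraction of all created people who are devoured, pooled over trials."""
    rng = random.Random(seed)
    devoured_total = 0
    spared_total = 0
    for _ in range(trials):
        devoured, spared = run_experiment(p, rng)
        devoured_total += devoured
        spared_total += spared
    return devoured_total / (devoured_total + spared_total)

if __name__ == "__main__":
    for p in (0.5, 0.1, 0.01):
        print(f"p = {p}: fraction devoured ~ {fraction_devoured(p):.3f}")
```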

End of Part One

Part Two


So to be clear, this is like the snake eyes paradox, but the probability of choosing "devour" each turn is unknown, so the booming voice is meant to make us consider that it must be a very low number, right?
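
For what it's worth, here is one hedged way to formalize that reading, setting aside anthropic considerations about which group you find yourself in: if you put a uniform prior on the unknown, fixed per-round DOOM probability and take the voice at face value, then observing k consecutive LIFE judgements gives, by Laplace's rule of succession, a posterior predictive probability of DOOM on the next judgement of 1/(k+2). Plugging in the count quoted by the voice:

```python
from fractions import Fraction

# ASSUMPTIONS (not stated in the problem): a uniform prior over the unknown,
# fixed per-round probability of DOOM, and a truthful voice. Observing k
# consecutive LIFE outputs then gives, by Laplace's rule of succession,
# P(DOOM on the next judgement) = 1 / (k + 2).

k = 12_463_907_304_005            # judgements reported before yours
p_next_doom = Fraction(1, k + 2)
print(float(p_next_doom))         # ~ 8.0e-14
```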

I'm afraid this doesn't work. To what degree do I trust the voice? To what degree do I trust the note?

Turn this into a simulated environment where we need to program the decision algorithm?

@Primer I can add whatever qualifiers you want, but I believe all relevant information is delivered from a God's-eye view. If the voice changes your answer, I'm happy to specify that it was not lying, if that helps.

@Sailfish Can't get my head around the fact that a rational agent would think he was pranked or similar. The storytelling clashes with hypothetical thinking mode.

@Primer I actually don't think it matters. If you consider two rational agents with different priors, they still cannot just disagree on those priors; each should believe that his or her prior better reflects the actual state of the universe in some way. If I were asking you to bet, consider which agent, with which priors, would do better.

If you want to skip ahead and engage with the substance, I've already posted the answer here (I was somewhat impatient):

https://manifold.markets/dreev/is-the-probability-of-dying-in-anth#7tkxIkgI3Dgb4E6fcy0k

I promise it is sufficiently dry and lacking in artistic flair.

@Sailfish

I promise it is sufficiently dry and lacking in artistic flair.

Hehe, appreciated!

Ah, alas, I would update toward being in a thought experiment or simulation, and therefore try to figure out whether I can impact the next level of reality, which really makes these problems difficult to answer but maybe also decreases the chance that I find myself in them.

@MartinRandall Interesting. Is there a reason you can't assign a conditional probability? For example, even if you are explicitly told you are being simulated, surely you can reason about how likely certain events in the simulation are. I somewhat explicitly left a jailbreak, so I guess I am curious about what your argument for "Other" would be and how you would formalize it.

@Sailfish Well, for example, if I assign a conditional probability, does that make it more likely that The Doctor places me in the situation? If so it's better to refuse to reason unless the hypothetical is a net positive experience.

@MartinRandall I would not have guessed "Pre-commit to refusing to act as an agent so you are less likely to be simulated" as a possible answer; I guess I'll add it however you would like to phrase it. I might tender that it is net positive in the case where you have a very near zero chance of being eaten, and that reasoning about the simulation requires you to evaluate it in some way. For example, if you were to be delivered, say, a cupcake if you survived, you would need to evaluate the probability of survival to determine whether it was net positive or not.

I think you can answer either question in whatever order, but I will say that I think Part Two is trivial and is mostly for people to register their beliefs.

@Sailfish Yeah, it's a pretty weird conclusion to come to but I think it holds up.

I left open the option to add answers which I'll do if people want.

Part Two

I forgot that I couldn't pin comments, oh well.
