[Add answers] What can each Manifold user be convinced of in the next month? (Debate market)
75% or more of the debate answers in this market resolve NO (77%)
@SaviorofPlant can be convinced that P(doom) < 99% (71%)
@SaviorofPlant can be convinced to sell a 100 mana or more position in any market they wouldn't have otherwise sold (66%)
@SaviorofPlant can be convinced that the chance of GPT-5 causing doom is <1% (66%)
@TimothyJohnson5c16 can be convinced to vote for any Republican for a California statewide office in the November 2024 election (65%)
Two debate answers added by separate users in this market resolve YES (61%)
@SaviorofPlant can be convinced to spend 100 dollars on anything they wouldn't have otherwise bought (59%)
@SaviorofPlant can be convinced that LLM chatbots are not meaningfully agentic (59%)
@SaviorofPlant can be convinced to stop using caffeine for a month or more (48%)
@JamesF can be convinced that P(doom) > 50% (41%)
@dglid can be convinced that the Manifold Pivot is a bad business decision by the Manifold team (34%)
Any single user causes at least 3 answers in this market to resolve YES by winning debates (33%)
@SaviorofPlant can be convinced to stop using Manifold (33%)
@SaviorofPlant can be convinced that this is a stupid market idea (30%)
@RobertCousineau can be convinced to take 21 days off nicotine products (28%)
@NBAP can be convinced to accept an eternity in limbo (empty white void) in the afterlife in exchange for some earthly reward in life (27%)
@Bayesian can be convinced that digital computers can never become conscious (27%)
Any comment in this market will have more than 100 replies engaging in meaningful debate (24%)
@NBAP can be convinced that an eternity in a heavenly afterlife is as desirable (or more) than an eternity in a hellish afterlife is undesirable (22%)
@Bayesian can be convinced not to donate his remaining nw to charity (15%)

This is an experimental market where you can add a belief you hold as an answer, and users can bet on whether anyone on the site can convince you otherwise, either in the comments of this market or in Manifold DMs. (You can also add answers about other users, but I'd make sure they're willing to participate first.) It's meant to be a rough equivalent of /r/CMV. The idea is for users to scope out others' beliefs, even if they don't particularly intend to debate them, and bet accordingly, creating a rough ordering of which users and subjects are likely to be easier or harder to convince.

To resolve an answer YES, simply comment or DM me that you have been convinced and who convinced you. If you naturally change your mind without talking about it with anyone on the site, I will resolve the answer NO. Meta answers about the market itself, like "Two debate answers added by separate users in this market resolve YES", are allowed. I think it's ideal if users do not bet on their own answers, since that creates perverse incentives, but I won't try to stop people from doing so.

The market close date will not change, and all remaining open answers resolve NO at that time.

@dglid can be convinced that the Manifold Pivot is a bad business decision by the Manifold team

Convince me that the Pivot is bad!

@dglid The main thing that bothers me about the pivot is that the removal of loans created tons of volatility in long-term markets as people liquidated positions. Manifold used to be excellent for predicting things years in the future; good predictors now have little incentive to participate in these markets, since you can still earn points in short-term markets.

@SaviorofPlant Agree, it's likely the pivot will make long-term markets less active. However, a potential future is that the pivot attracts so many new users that there's just as much activity as there is now. If not, then I'd argue that maybe that's a good reason for Metaculus to stick around - Manifold and Metaculus can serve truly different parts of the market rather than competing as they more or less do now.

@TimothyJohnson5c16 can be convinced to vote for any Republican for a California statewide office in the November 2024 election

My voting philosophy is similar to Scott Alexander's - I vote for Democrats by default, but I'm occasionally willing to vote for a Republican as a protest if they seem well qualified and reasonable.

I can't remember for certain, but I believe I voted for exactly one Republican in 2022. I haven't checked any details yet about the candidates running this year.

@TimothyJohnson5c16 I should add, I won't bet on this, but my initial prior is that it's a little more likely than not - somewhere around 60%.

@TimothyJohnson5c16 There don't seem to be many statewide elections this year: https://ballotpedia.org/California_elections,_2024

What is your opinion on Adam Schiff?

@SaviorofPlant I know basically nothing about Adam Schiff. I voted for Katie Porter in the primary - I don't always agree with her, but the way she uses her whiteboard seems to bring something valuable to discussions that are often lacking in hard data.

@TimothyJohnson5c16 I'm gonna be honest, I don't think I'm going to be able to convince you to vote for Steve Garvey. If only there was a gubernatorial race this year...

@NBAP can be convinced to accept an eternity in limbo (empty white void) in the afterlife in exchange for some earthly reward in life.

@NBAP do you believe in an afterlife?

@shankypanky As in P(Afterlife) > 0.5? If so, no. But for the sake of the hypothetical, assume the existence of an afterlife were undeniably demonstrated to me before I make my choice.

it would be absolutely insane to accept an eternity in limbo for literally anything that is not similarly infinite

@Bayesian It depends on your stance on infinite utilities, I would think.

@NBAP right, any stance on infinite utilities that isn't the correct one is absolutely insane

@Bayesian Can you elaborate on what the correct stance on infinite utilities is?

@NBAP that they're infinitely more important than finite utility

@Bayesian That seems fairly straightforward, but in order for me to accept that, I would first need to accept that the concept of infinite utility is even meaningful, which I'm not yet convinced of.

@NBAP if you care about some finite amount of time, and the degree to which you care about some bounded amount of time doesn't decrease at a sufficient rate that it converges after an infinite amount of time, then you necessarily care an infinite amount about an infinite amount of time

@Bayesian It might converge after an infinite amount of time, for all I know.

@NBAP that is not a meaningful sentence. "after an infinite amount of time" doesn't make sense

@Bayesian I took the exact wording from you. How would you word it more meaningfully?

@NBAP lol good point. yeah ig i should have said converge over an infinite amount of time. but yeah it's possible that it does, and for any epsilon of caring you can go far enough in the future that you don't care that amount about some bounded amount of time or wtv

@Bayesian It seems to me to be possible, in principle, that the quality of being in limbo might be such that an eternity of limbo converges to some finite disutility which might be lesser than the utility of some very idyllic lifetime on Earth. I'm not convinced that this is true, to be clear (hence why I would need to be convinced), but it does seem to me to be possible in principle.
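
To make the convergence point concrete, here is one standard way the arithmetic could go (a sketch with assumed numbers, not something either commenter specified): suppose each successive year of limbo carries a disutility that shrinks geometrically, say year t costs c·γ^t for some per-year cost c > 0 and discount factor 0 < γ < 1. Then the total disutility over eternity is

\sum_{t=0}^{\infty} c\,\gamma^{t} = \frac{c}{1-\gamma},

which is finite, so a sufficiently large earthly reward could in principle outweigh it. If the per-year disutility does not shrink (γ = 1), the sum diverges, which is the case where accepting limbo for any finite reward looks, as @Bayesian puts it, insane.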

@NBAP This seems plausible to me. You'd go crazy for a while, but eventually your brain would adapt. I've heard (questionable) stories about how people in solitary confinement long enough start to hallucinate, and I can only imagine what would happen after 10,000 years.

I would expect your memories of life to slowly be replaced by memories entirely generated by your brain after a certain point. The description that makes the most sense would be something like an endless dream, although enough time with no stimulus might eventually destroy the ability of your brain to have any thoughts.

@SaviorofPlant I’m not sure I consider this a particularly compelling approach. I regard the prospect of losing all of my memories from my lifetime on Earth (including of the earthly rewards being traded for) to be a significant disutility in its own right.

@NBAP Compared to the alternative, you are spending like a hundred times longer with those memories. There's much more time to appreciate them if that's what you value.

@SaviorofPlant Conversely, losing memories over a prolonged period might very plausibly be a much greater disutility than losing them all abruptly (as in the case of a normal death).
