Is there any point in having manifold-bucks post-superintelligence?
Never closes
yes, at least as much as now
less so
no, no point
This question is managed and resolved by Manifold.
Also consider always betting against the arrival of ASI in markets about it, regardless of your true beliefs: if ASI does arrive, your mana is presumably worthless anyway, so losing the bet costs you nothing you could have kept, while a NO bet pays out in exactly the worlds where mana still matters.
This seems fairly bad for people who aren't in on the metagame and take market predictions at face value. I don't see a way around this, besides mitigating it with disclaimers and discouraging doomsday markets (or other markets that have distorted betting incentives).
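A rough expected-value sketch may make the distortion concrete. This is a hypothetical model, not Manifold's actual payout mechanics: it approximates a bet as paying fair odds at the market probability, and it assumes (as the metagame argument does) that mana is worth nothing in worlds where ASI has arrived. All numbers are made up for illustration.

```python
# Illustrative sketch (assumptions, not Manifold's real mechanics): why the
# mana-value asymmetry pushes bets toward NO on "ASI arrives" markets.

def expected_value(p_asi, market_prob, mana_value_if_asi=0.0, mana_value_if_not=1.0):
    """Expected real-world value of a 1-mana bet on each side of an ASI market.

    p_asi:             bettor's true probability that ASI arrives
    market_prob:       market's implied probability of ASI
    mana_value_if_*:   assumed worth of a unit of mana in each world
    """
    # A 1-mana YES bet pays roughly 1/market_prob mana if ASI arrives, else 0,
    # but that payout is discounted by mana's value in ASI worlds.
    ev_yes = p_asi * (1 / market_prob) * mana_value_if_asi
    # A 1-mana NO bet pays roughly 1/(1 - market_prob) mana if ASI does not arrive.
    ev_no = (1 - p_asi) * (1 / (1 - market_prob)) * mana_value_if_not
    return ev_yes, ev_no

# Even a bettor who thinks ASI is 70% likely should bet NO at a 50% market
# if mana is worthless post-ASI: the YES payout only lands in worlds where,
# by assumption, it cannot be spent.
ev_yes, ev_no = expected_value(p_asi=0.7, market_prob=0.5)
print(f"EV of YES: {ev_yes:.2f}, EV of NO: {ev_no:.2f}")  # EV of YES: 0.00, EV of NO: 0.60
```

The upshot is that such a market's price tracks the expected value of mana across worlds, not the probability of ASI, which is why taking it at face value misleads.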
Related questions
Conditional on no existential catastrophe, will there be a superintelligence by 2040? (71% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2030? (38% chance)
When Manifold's AGI countdown resolves YES, will Manifold users think that AGI really has been achieved? (51% chance)
Will six months or fewer elapse between when Manifold declares the achievement of AGI and superintelligence? (41% chance)
Will Manifold be cited in a paper from a top-tier AI lab or AI journal before 2025? (11% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2100? (75% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2050? (79% chance)
Is scale unnecessary for intelligence (<10B param super-human model before 2027)? (30% chance)
If AI wipes out humanity, will mana still have value afterwards? (48% chance)
An AI is trustworthy-ish on Manifold by 2030? (46% chance)