Is there any point in having manifold-bucks post-superintelligence?
Never closes
yes, at least as much as now
less so
no, no point
This question is managed and resolved by Manifold.
Also consider always betting against the arrival of ASI in markets about it, regardless of your true beliefs.
This seems fairly bad for people who aren't in on the metagame and take market predictions at face value. I don't see a way around it, beyond mitigating the distortion with disclaimers and discouraging doomsday markets (or other markets whose betting incentives are similarly skewed).
Related questions
Will six months or fewer elapse between when Manifold declares the achievement of AGI and superintelligence?
41% chance
Will Manifold stop using AI to make my questions worse by the end of 2025?
48% chance
Will Bostrom's "Superintelligence" exceed its current popularity peak before 2028?
15% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2040?
75% chance
When Manifold's AGI countdown resolves YES, will Manifold users think that AGI really has been achieved?
46% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2100?
87% chance
Will AI wipe out Manifold by 2030?
4% chance
If Manifold deems AGI to have been achieved, will Manifold also agree that the singularity has occurred?
37% chance
There will be only one superintelligence for a sufficiently long period that it will become a singleton
64% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2050?
79% chance