At the beginning of 2028, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
67% chance
If an intelligence explosion occurs, this market resolves N/A. Otherwise:
Shortly after market close, I will post a Yes/No poll in this market's comments, in the Manifold Discord, and/or in whatever other appropriate Manifold-related spaces exist at that time. It will ask:
Do you believe that a rapid AI intelligence explosion poses a significant existential risk to humanity before 2075?
This market resolves to the percentage of Yes votes in the poll, rounded to the nearest integer.
The poll will be limited to one response per Manifold account, and how everyone voted will be public.
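The resolution arithmetic is simple enough to express directly. Below is a minimal Python sketch of it, assuming the poll responses are collected as a list of "Yes"/"No" strings; the function name and data shape are illustrative only and are not part of Manifold's polling API.

```python
def resolution_value(votes: list[str]) -> int:
    """Resolution percentage: share of 'Yes' votes, rounded to the nearest integer.

    Hypothetical helper for illustration; Python's round() uses round-half-to-even,
    which only matters if the Yes share lands exactly on a .5 percentage point.
    """
    yes = sum(1 for v in votes if v == "Yes")
    return round(100 * yes / len(votes))

# Example: 67 Yes votes out of 100 responses resolves the market to 67.
print(resolution_value(["Yes"] * 67 + ["No"] * 33))  # -> 67
```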
All markets for each year:
This question is managed and resolved by Manifold.
Related questions
At the beginning of 2025, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
64% chance
At the beginning of 2026, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
66% chance
At the beginning of 2027, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
69% chance
At the beginning of 2040, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
67% chance
At the beginning of 2029, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
77% chance
At the beginning of 2035, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
73% chance
At the beginning of 2030, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
73% chance
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
50% chance
An AI is trustworthy-ish on Manifold by 2030?
46% chance
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
72% chance