
At the end of 2023, will I believe that a rapid intelligence explosion is a plausible result of AI capabilities research, and the possibility is worth spending some non-negligible amount of effort investigating?
670 · Ṁ2539 · resolved Jan 3
Resolved YES
I can't see any reason why it wouldn't be, but I also haven't looked into it that deeply. I know there are a lot of smart people who think it isn't possible, so it seems likely there's something they understand that I don't.
Resolves N/A if an intelligence explosion occurs before market close.
This question is managed and resolved by Manifold.
Related questions

- In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified? — 75% chance
- At the beginning of 2026, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075? — 71% chance
- Will someone commit terrorism against an AI lab by the end of 2025 for AI-safety related reasons? — 14% chance
- Will I think the "AI has a data bottleneck" people are dumb before the end of 2025? — 58% chance
- Will AI be smarter than any one human probably around the end of 2025? — 16% chance
- Will some piece of AI capabilities research done in 2023 or after be net-positive for AI alignment research? — 81% chance
- Will an AI be solely responsible for an AI breakthrough by the end of 2030? — 76% chance
- Will AI surpass humans in conducting scientific research by 2030? — 40% chance
- Will there be an intelligence explosion before 2035? — 69% chance
- At the beginning of 2028, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075? — 68% chance