
In 2025, what % of EA lists "AI risk" as their top cause?
Ṁ274 volume · resolved Jan 1
Resolved as 35%
Resolves to a percentage, according to the latest publicly available community-wide survey.
In 2020, it was 14%:
https://forum.effectivealtruism.org/posts/83tEL2sHDTiWR6nwo/ea-survey-2020-cause-prioritization#Top_Cause_Percentages
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ6 |
| 2 | | Ṁ3 |
| 3 | | Ṁ2 |
Related questions
In Jan 2027, Risks from Artificial Intelligence (or similar) will be on 80,000 hours top priority list
94% chance
How much will AI advances impact EA research effectiveness, by 2030?
At the beginning of 2027, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
69% chance
Will AI lead to an S-risk by 2100?
25% chance
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
64% chance
Will >60% of EAs believe that "Pause AI" protests have been net positive in 2030?
33% chance
If AI doesn't destroy humanity, what proportion of future value (relative to 2023 EAs' CEV) will be attained?
76% chance
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
72% chance
Will AI xrisk seem to be handled seriously by the end of 2026?
17% chance
Are AI and its effects the most important existential risk, given only public information available in 2021?
89% chance