
In 2025, what % of EAs list "AI risk" as their top cause?
44% chance
Resolves to the percentage reported in the latest publicly available community-wide survey.
In 2020, it was 14%:
https://forum.effectivealtruism.org/posts/83tEL2sHDTiWR6nwo/ea-survey-2020-cause-prioritization#Top_Cause_Percentages
This question is managed and resolved by Manifold.
Related questions
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
55% chance
Will >90% of Elon re/tweets/replies on 19 December 2025 be about AI risk?
7% chance
In Jan 2027, Risks from Artificial Intelligence (or similar) will be on 80,000 hours top priority list
94% chance
In 2025, what percentage of EAs are non-male?
34% chance
Will >60% of EAs believe that "Pause AI" protests have been net positive in Q4 2025?
57% chance
How much will AI advances impact EA research effectiveness, by 2030?
At the beginning of 2026, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
66% chance
Will AI lead to an S-risk by 2100?
25% chance
Will >60% of EAs believe that "Pause AI" protests have been net positive in 2030?
36% chance
If AI doesn't destroy humanity, what proportion of future value (relative to 2023 EAs' CEV) will be attained?
76% chance