How much should society prioritize AI safety research, relative to how much it is currently prioritized?
Never closes
Much less
Less
Meh - about the same
More
Much more
Related questions
If AI safety is divided by left/right politics in the next 5 years, will the left be more pro-regulation than the right?
50% chance
Is AI Safety a grift?
33% chance
Will the US government require AI labs to run safety/alignment evals by 2025?
39% chance
When will AI be better than humans at AI research? (Basically AGI)
I am an AI safety researcher with a background in machine learning engineering and neuroscience. Will I personally be able to program and train an AGI for less than $10k by 2030?
23% chance
In 2025, will I believe that aligning automated AI research AI should be the focus of the alignment community?
48% chance
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
79% chance
By end of 2028, will there be a global AI organization, responsible for AI safety and regulations?
40% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
64% chance
According to 20 AI safety experts, what is the most promising research direction in AI safety today?