
Is climate change or AGI a bigger existential threat to humans?
Never closes
Climate Change
Artificial General Intelligence
Existential threat refers to a severe level of harm to society, ranging from the complete destruction of our way of life to total extinction.
This question is managed and resolved by Manifold.
Related questions
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
56% chance
If AGI causes human extinction before 2100, which type of misalignment will be the biggest cause?
What is the greatest existential threat (true extinction-level threat) to humanity?
Conditional on AGI taking over the world, what will it do to the humans?
By at least 10 years before human-level AGI is built, debate about AGI risk will be as mainstream as global warming is in 2015
20% chance
Are AI and its effects the most important existential risk, given only public information available in 2021?
89% chance
Will AGI cause the collapse of civilization before the end of 2025? 📎
1% chance
Do you think that AGI by 2030 is inevitable?
82% chance
Will AI cause an existential catastrophe (Bostrom or Ord definition) which doesn't result in human extinction?
25% chance
Will AI pose an existential threat to humanity before 2050?
7% chance