Have you signed the "Statement on Superintelligence"?
Never closes
Yes
Not yet but support it
No and don't support it
Lizardman
Context
YouTube video:
Link to the statement: https://superintelligence-statement.org/
We call for a prohibition on the development of superintelligence, not lifted before there is broad scientific consensus that it will be done safely and controllably, and strong public buy-in.
This question is managed and resolved by Manifold.
Related questions
Conditional on no existential catastrophe, will there be a superintelligence by 2040? (68% chance)
What organization will first create superintelligence?
Will OpenAI publicly state that they know how to safely align a superintelligence before 2030? (21% chance)
In which year will a majority of AI researchers concur that a superintelligent, fairly general AI has been realized?
Conditional on no existential catastrophe, will there be a superintelligence by 2030? (26% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2050? (72% chance)
If there exists a super-intelligent AI, would a majority of AI researchers answer Yes to "Have we reached AGI?"? (67% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2075? (85% chance)
Thinking Machines Lab releases product before Safe Superintelligence? (93% chance)
Conditional on no existential catastrophe, will there be a superintelligence by 2100? (86% chance)