MANIFOLD
If AI has an okay outcome, was it because of humanity doing something beyond business-as-usual?
Closes 2031 · 75% chance

Resolves N/A if AI does not have an OK outcome.

Otherwise, resolves YES if

  • this OK outcome is because of an AI pause

  • OR the transhumanist future is achieved through non-AI technology

  • OR humans are enhanced as part of the process by which alignment is solved

  • OR there is a nonroutine effort for alignment, with AI being made by an organisation that makes alignment a top priority, treating it as the primary function of the organisation and as a blocker on capabilities rather than as something that can be done alongside capabilities.
