How will "Governance of superintelligence" by OpenAI cause Yudkowsky to update?
Ṁ113 · resolved Jun 22
20%: Update towards higher probability that we all die (p(doom) increase)
18%: Update towards lower probability that we all die (p(doom) decrease)
61%: He will keep it a secret
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ36
2 | | Ṁ5
3 | | Ṁ1
Related questions
Will OpenAI undergo significant restructuring by 2025?
50% chance
Will Ilya Sutskever still lead OpenAI’s Superalignment team at the end of 2024? [ACX 2024]
40% chance
At the beginning of 2035, will Eliezer Yudkowsky still believe that AI doom is coming soon with high probability?
63% chance
Will Yudkowsky agree that his "death with dignity" post overstated the risk of extinction from AI, by end of 2029?
27% chance
Will this Yudkowsky tweet on AI babysitters hold up by Feb 2028?
41% chance
Will Ilya Sutskever leave OpenAI in 2024?
55% chance
Will Eliezer Yudkowsky have any meeting about AI with any member of the US Congress, Senate, or White House before 2025?
57% chance
Will OpenAI's structure change?
21% chance
Will the world's first superintelligence come from OpenAI? [M$300 liquidity subsidy]
33% chance
Will Eliezer Yudkowsky be employed at one of the top AI labs in 2023 or 2024?
14% chance