Will ""Carefully Bootstrapped Alignment" is organiz..." make the top fifty posts in LessWrong's 2023 Annual Review?
Ṁ14 · Resolved NO on Feb 11
As part of LessWrong's Annual Review, the community nominates, writes reviews, and votes on the most valuable posts. Posts are reviewable once they have been up for at least 12 months, and the 2023 Review resolves in February 2025.
This market will resolve to 100% if the post "'Carefully Bootstrapped Alignment' is organizationally hard" is one of the top fifty posts of the 2023 Review, and to 0% otherwise. The market was initialized to 14%.
This question is managed and resolved by Manifold.
Related questions
Will "Making a conservative case for alignment" make the top fifty posts in LessWrong's 2024 Annual Review?
13% chance
Will "How to replicate and extend our alignment fak..." make the top fifty posts in LessWrong's 2024 Annual Review?
14% chance
Will "Takes on "Alignment Faking in Large Language ..." make the top fifty posts in LessWrong's 2024 Annual Review?
19% chance
Will "Introducing Alignment Stress-Testing at Anthropic" make the top fifty posts in LessWrong's 2024 Annual Review?
10% chance
Will "What Is The Alignment Problem?" make the top fifty posts in LessWrong's 2025 Annual Review?
15% chance
Will "Without fundamental advances, misalignment an..." make the top fifty posts in LessWrong's 2024 Annual Review?
50% chance
Will "Key takeaways from our EA and alignment resea..." make the top fifty posts in LessWrong's 2024 Annual Review?
10% chance
Will "Alignment Faking in Large Language Models" make the top fifty posts in LessWrong's 2024 Annual Review?
94% chance
Will "LLMs for Alignment Research: a safety priority?" make the top fifty posts in LessWrong's 2024 Annual Review?
13% chance
Will "“Alignment Faking” frame is somewhat fake" make the top fifty posts in LessWrong's 2024 Annual Review?
19% chance