Will "LLMs for Alignment Research: a safety priority?" make the top fifty posts in LessWrong's 2024 Annual Review?
13% chance
As part of LessWrong's Annual Review, the community nominates, writes reviews of, and votes on the most valuable posts of the year. Posts become reviewable once they have been up for at least 12 months, and the 2024 Review resolves in February 2026.
This market resolves to 100% if the post "LLMs for Alignment Research: a safety priority?" finishes among the top fifty posts of the 2024 Review, and to 0% otherwise. The market was initialized to 14%.
This question is managed and resolved by Manifold.
Related questions
Will "Alignment Implications of LLM Successes: a De..." make the top fifty posts in LessWrong's 2023 Annual Review?
37% chance
Will "Shallow review of live agendas in alignment &..." make the top fifty posts in LessWrong's 2023 Annual Review?
61% chance
Will "Against LLM Reductionism" make the top fifty posts in LessWrong's 2023 Annual Review?
21% chance
Will "Without fundamental advances, misalignment an..." make the top fifty posts in LessWrong's 2024 Annual Review?
46% chance
Will "Model Organisms of Misalignment: The Case for..." make the top fifty posts in LessWrong's 2023 Annual Review?
79% chance
Will "Refusal in LLMs is mediated by a single direction" make the top fifty posts in LessWrong's 2024 Annual Review?
29% chance
Will "Agentized LLMs will change the alignment land..." make the top fifty posts in LessWrong's 2023 Annual Review?
10% chance
Will "A Case for the Least Forgiving Take On Alignment" make the top fifty posts in LessWrong's 2023 Annual Review?
11% chance
Will "Tips for Empirical Alignment Research" make the top fifty posts in LessWrong's 2024 Annual Review?
24% chance
Will "LLMs Sometimes Generate Purely Negatively-Rei..." make the top fifty posts in LessWrong's 2023 Annual Review?
14% chance