
Will I post a sequence on AI alignment to LessWrong before July 1?
Ṁ33 · resolved Jul 1
Resolved NO
I have a rough draft of part of the sequence ready and feel it might be somewhat valuable, but I would like people to review it first; I don't want to post it if it doesn't meet the epistemic standards of the site. I currently have no plan for finding reviewers, no connections in the community (though I am working on this), and no previous activity on the site.
This question is managed and resolved by Manifold.
Related questions
Will "Announcing ILIAD — Theoretical AI Alignment ..." make the top fifty posts in LessWrong's 2024 Annual Review? · 9% chance
Will "How to replicate and extend our alignment fak..." make the top fifty posts in LessWrong's 2024 Annual Review? · 14% chance
Will "Takes on "Alignment Faking in Large Language ..." make the top fifty posts in LessWrong's 2024 Annual Review? · 19% chance
Will "Alignment Faking in Large Language Models" make the top fifty posts in LessWrong's 2024 Annual Review? · 94% chance
Will "The Field of AI Alignment: A Postmortem, and ..." make the top fifty posts in LessWrong's 2024 Annual Review? · 28% chance
Will "AIs Will Increasingly Attempt Shenanigans" make the top fifty posts in LessWrong's 2024 Annual Review? · 12% chance
Will "2023 in AI predictions" make the top fifty posts in LessWrong's 2024 Annual Review? · 13% chance
Will "You can, in fact, bamboozle an unaligned AI i..." make the top fifty posts in LessWrong's 2024 Annual Review? · 14% chance
Will "Introducing Alignment Stress-Testing at Anthropic" make the top fifty posts in LessWrong's 2024 Annual Review? · 10% chance
Will "Introducing AI Lab Watch" make the top fifty posts in LessWrong's 2024 Annual Review? · 9% chance