(LW Interesting Disagreements) Prosaic Alignment is currently more important to work on than Agent Foundations work.
Never closes
Options: Yes / No / Unsure
This question is managed and resolved by Manifold.
Related questions
Will Inner or Outer AI alignment be considered "mostly solved" first?
In 5 years will I think the org Conjecture was net good for alignment? (57% chance)
How difficult will Anthropic say the AI alignment problem is?
Will ARC's Heuristic Arguments research substantially advance AI alignment before 2027? (26% chance)
Will the 1st AGI solve AI Alignment and build an ASI which is aligned with its goals? (17% chance)