In 5 years will I think the org Conjecture was net good for alignment?
57% chance
Conjecture is an alignment firm founded by Connor Leahy, who also co-founded EleutherAI. They officially hold short (5-year) timelines until doom, and their work mostly consists of interpretability research, generating new alignment research agendas through Refine and SERI MATS, and convincing ML researchers and their labs that alignment is an important, hard problem.
Some have expressed worry that their short timelines and high probability of doom will lead them to advocate riskier strategies, making us worse off in worlds where timelines are longer than 5 years.
This market is, of course, conditional on Conjecture being wrong that we'll all die within 5 years. I'd like to know how much damage they do in that world.
This question is managed and resolved by Manifold.
Related questions
Will there exist a compelling demonstration of deceptive alignment by 2026? (70% chance)
By 2028, will I think Conjecture has been net-good for the world? (75% chance)
Will "Demystifying "Alignment" through a Comic" make the top fifty posts in LessWrong's 2024 Annual Review? (14% chance)
Will I think that alignment is no longer "preparadigmatic" by the start of 2026? (18% chance)
Will there be more alignmentforum posts from 2025 than 2024? (55% chance)
Will >= 1 alignment researcher/paper cite "maximum diffusion reinforcement learning" as alignment-relevant in 2025? (19% chance)
Will a major AI alignment office (eg Constellation/Lightcone/HAIST) give out free piksters to alignment ppl by EOY 2027? (43% chance)
Will "Making a conservative case for alignment" make the top fifty posts in LessWrong's 2024 Annual Review? (13% chance)
Will "What Is The Alignment Problem?" make the top fifty posts in LessWrong's 2025 Annual Review? (15% chance)
Will we solve AI alignment by 2026? (1% chance)