Will >$100M be invested in dedicated AI Alignment organizations in the next year as more people become aware of the risk we are facing by letting AI capabilities run ahead of safety?
Ṁ7,571 · Resolved YES · Oct 13
Taken from the second prediction in the State of AI Report.
>$100M is invested in dedicated AI Alignment organisations in the next year as more people become aware of the risk we are facing by letting AI capabilities run ahead of safety.
This question will resolve according to how the 2023 State of AI Report grades this prediction.
This question is managed and resolved by Manifold.
If a new OpenAI team passes the bar for the State of AI Report, it's likely over $100 million in pledged compute alone. As a reference point, Inflection.AI recently raised $1.3 billion to build a cluster of 22,000 H100s, which is over half a billion dollars of hardware at current prices (rough arithmetic below).
@BionicD0LPH1N What about Google's $300M investment in Anthropic? Don't you think that will be counted as an alignment org?
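A quick sanity check on the cluster-cost figure in the comment above, assuming a unit price of roughly $25,000 per H100 (the per-GPU price is an assumption, not taken from the source):

$$
22{,}000 \text{ H100s} \times \$25{,}000 \text{ per GPU} \approx \$550\text{M}
$$

That is consistent with the "over half a billion at current prices" estimate.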
Related questions
In 2025, will I believe that aligning automated AI research AI should be the focus of the alignment community?
59% chance
Will non-profit funding for AI safety reach 100 billion US dollars in a year before 2030?
38% chance
Will AI cause an incident resulting in $1b of losses or 100 lost lives?
Will the Gates Foundation give more than $100mn to AI Safety work before 2025?
25% chance
Will a large scale, government-backed AI alignment project be funded before 2025?
12% chance
Will a leading AI organization in the United States be the target of an anti-AI attack or protest by the end of 2024?
29% chance
Will a very large-scale AI alignment project be funded before 2025?
16% chance
Will OpenAI + an AI alignment organization announce a major breakthrough in AI alignment? (2024)
4% chance
I make a contribution to AI safety that is endorsed by at least one high profile AI alignment researcher by the end of 2026
59% chance
Will a >$10B AI alignment megaproject start work before 2030?
29% chance