
1GW AI training run before 2027?
71% chance
In "Situational Awareness: The Decade Ahead", Leopold Aschenbrenner predicts that the largest AI training clusters will consume 1GW of electricity in ~2026.
This market resolves YES if a training run of a single AI model consumes 1GW+ of power sustained through most of the training run. This power cost includes overhead to run the data center, such as cooling.
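For rough intuition about what the resolution threshold implies, the sketch below converts a sustained 1 GW facility draw into an approximate accelerator count. All figures (PUE, per-GPU power, non-GPU share) are illustrative assumptions, not taken from the market description.

```python
# Rough sketch: how many accelerators a 1 GW training cluster implies.
# All numeric assumptions below are illustrative, not from the market.

TOTAL_POWER_W = 1e9    # 1 GW sustained, including facility overhead
PUE = 1.2              # assumed power usage effectiveness (cooling etc.)
GPU_POWER_W = 700      # assumed per-accelerator draw (H100-class TDP)
NON_GPU_SHARE = 0.2    # assumed IT power going to CPUs/network/storage

it_power = TOTAL_POWER_W / PUE              # power reaching IT equipment
gpu_power = it_power * (1 - NON_GPU_SHARE)  # power left for accelerators
num_gpus = gpu_power / GPU_POWER_W

print(f"IT power: {it_power / 1e6:.0f} MW")
print(f"Approx. accelerators: {num_gpus:,.0f}")
```

Under these assumptions a qualifying run would span on the order of a million H100-class accelerators; different PUE or per-chip figures shift the estimate proportionally.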
This is one of a series of markets on claims made in Leopold Aschenbrenner's Situational Awareness report.
This question is managed and resolved by Manifold.
People are also trading
10GW AI training run before 2029? (20% chance)
100GW AI training run before 2031? (33% chance)
$1T AI training cluster before 2031? (50% chance)
xAI announces own ASIC/XPU for AI workloads by March 31, 2026? (32% chance)
$100B AI training cluster before 2029? (88% chance)
AI: Will someone train a $1T model by 2080? (69% chance)
Will an international pause on large AI training runs be in effect on Jan 1, 2028? (5% chance)
Will there be a global pause on the largest AI training runs at any point before AGI? (20% chance)
AI: Will someone train a $1T model by 2050? (81% chance)
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027? (16% chance)
