
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Ṁ535 · Ṁ1.2k · Jul 2
<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 12%
[1e25, 3e25): 15%
[3e25, 1e26): 33%
[3e26, 1e27): 25%
[1e27, 3e27): 4%
[3e27, 1e28): 1.9%
[1e28, 3e28): 1.6%
[3e28, 1e29): 1.2%
>=1e29: 1.2%
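For readers mapping a specific model onto these buckets, the common rule of thumb is that dense-transformer training compute is roughly 6 FLOP per parameter per training token. The sketch below applies that approximation and classifies the result against the market's answer buckets; the parameter and token counts used are Meta's publicly reported figures for Llama 3.1 405B and are illustrative, not part of this market's resolution criteria.

```python
# Rough training-compute estimate via the standard 6*N*D approximation
# (~6 FLOP per parameter per training token for dense transformers).
# Model figures are the publicly reported Llama 3.1 405B numbers,
# used here purely as an illustration.

def training_flop(params: float, tokens: float) -> float:
    """Approximate total training FLOP with the 6*N*D rule of thumb."""
    return 6 * params * tokens

def bucket(flop: float) -> str:
    """Map a FLOP estimate onto the market's answer buckets."""
    edges = [1e24, 3e24, 1e25, 3e25, 1e26, 3e26, 1e27, 3e27, 1e28, 3e28, 1e29]
    if flop < edges[0]:
        return "<1e24"
    for lo, hi in zip(edges, edges[1:]):
        if lo <= flop < hi:
            return f"[{lo:.0e}, {hi:.0e})"
    return ">=1e29"

# ~405B parameters trained on ~15.6T tokens -> about 3.8e25 FLOP,
# which lands in the [3e25, 1e26) bucket.
flop = training_flop(405e9, 15.6e12)
print(f"{flop:.2e} -> {bucket(flop)}")
```

Note this is only a first-order heuristic: it ignores mixture-of-experts sparsity, multiple epochs, and any post-training compute, all of which can shift which bucket a model falls into.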
This question is managed and resolved by Manifold.
Related questions
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026 (37% chance)
Most training run compute greater than 2e27 FLOP by EOY 2026? (92% chance)
Will a machine learning training run exceed 10^25 FLOP in China before 2027? (82% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2028? (44% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2027? (86% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2028? (85% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2029? (82% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2030? (79% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2040 be in the United States? (46% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2035 be in the United States? (69% chance)