
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 12%
[1e25, 3e25): 15%
[3e25, 1e26): 33%
[3e26, 1e27): 25%
[1e27, 3e27): 4%
[3e27, 1e28): 1.9%
[1e28, 3e28): 1.6%
[3e28, 1e29): 1.2%
>=1e29: 1.2%
This question is managed and resolved by Manifold.
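To read the bucketed forecast above as a single number, here is a minimal sketch that computes the probability-weighted geometric mean of the buckets. The log-space midpoint placement and the nominal bounds for the open-ended buckets (<1e24 and >=1e29) are assumptions of this sketch, not figures from the market, and the listed probabilities are renormalized since they sum to slightly under 100% as shown.

```python
import math

# (lower bound, upper bound, probability) for each bucket as listed above.
# Assumptions: open-ended buckets get nominal bounds of 1e23 and 3e29, and
# each bucket's mass is placed at its log-space midpoint. A [1e26, 3e26)
# bucket does not appear in the listing above, so none is included here.
buckets = [
    (1e23, 1e24, 0.02),   # <1e24
    (1e24, 3e24, 0.02),
    (3e24, 1e25, 0.12),
    (1e25, 3e25, 0.15),
    (3e25, 1e26, 0.33),
    (3e26, 1e27, 0.25),
    (1e27, 3e27, 0.04),
    (3e27, 1e28, 0.019),
    (1e28, 3e28, 0.016),
    (3e28, 1e29, 0.012),
    (1e29, 3e29, 0.012),  # >=1e29
]

# Renormalize, since the listed probabilities sum to ~98.9%.
total = sum(p for _, _, p in buckets)

# Expected value of log10(FLOP), i.e. the probability-weighted geometric mean.
expected_log10 = sum(
    p / total * (math.log10(lo) + math.log10(hi)) / 2
    for lo, hi, p in buckets
)
print(f"Probability-weighted geometric mean: ~1e{expected_log10:.1f} FLOP")
```

Under these assumptions the distribution centers around roughly 1e26 FLOP; this is only an illustration of how the bucket probabilities combine, not a figure stated by the market.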
Related questions
Will an AI model use more than 1e28 FLOPS in training before 2026?
8% chance
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026
59% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2026?
52% chance
Will the largest machine learning training run (in FLOP) as of the end of 2025 be in the United States?
86% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
15% chance
Will there be an announcement of a model with a training compute of over 1e30 FLOPs by the end of 2025?
5% chance
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
20% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2027?
82% chance
Most training run compute greater than 2e27 FLOP by EOY 2026?
91% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
33% chance