
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Probability by FLOP range:

<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 12%
[1e25, 3e25): 15%
[3e25, 1e26): 33%
[3e26, 1e27): 25%
[1e27, 3e27): 4%
[3e27, 1e28): 1.9%
[1e28, 3e28): 1.6%
[3e28, 1e29): 1.2%
>=1e29: 1.2%
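The bucket percentages above can be summarized with a few lines of arithmetic. A minimal sketch (not part of the market page) that totals the listed probabilities and finds the bucket containing the distribution's median; note the listed buckets skip the [1e26, 3e26) range, so the total falls short of 100% and is used as the normalizer:

```python
# Bucket probabilities as listed on the market page.
buckets = [
    ("<1e24", 2.0),
    ("[1e24, 3e24)", 2.0),
    ("[3e24, 1e25)", 12.0),
    ("[1e25, 3e25)", 15.0),
    ("[3e25, 1e26)", 33.0),
    ("[3e26, 1e27)", 25.0),
    ("[1e27, 3e27)", 4.0),
    ("[3e27, 1e28)", 1.9),
    ("[1e28, 3e28)", 1.6),
    ("[3e28, 1e29)", 1.2),
    (">=1e29", 1.2),
]

# The scrape omits the [1e26, 3e26) bucket, so the total is ~98.9%,
# not 100%; normalize by the listed total when computing the median.
total = sum(p for _, p in buckets)

cumulative = 0.0
median_bucket = None
for name, p in buckets:
    cumulative += p / total
    if median_bucket is None and cumulative >= 0.5:
        median_bucket = name  # first bucket where cumulative mass crosses 50%

print(f"total listed probability: {total:.1f}%")  # ~98.9%
print(f"median bucket: {median_bucket}")          # [3e25, 1e26)
```

Roughly 73% of the listed mass sits in the three buckets spanning 1e25 to 1e27 FLOP, and the cumulative probability first crosses 50% in the [3e25, 1e26) bucket.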
This question is managed and resolved by Manifold.
Related questions
At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs (96% chance)
Will an AI model use more than 1e28 FLOPS in training before 2026? (24% chance)
At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs (96% chance)
OpenAI to release model weights by EOY? (88% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2025 be in the United States? (89% chance)
Will a lab train a >=1e26 FLOP state space model before the end of 2025? (15% chance)
At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs (82% chance)
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026 (44% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2026? (52% chance)
Will a machine learning training run exceed 10^25 FLOP in China before 2027? (82% chance)