How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Current probability by FLOP range:
<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 8%
[1e25, 3e25): 21%
[3e25, 1e26): 32%
[3e26, 1e27): 23%
[1e27, 3e27): 5%
[3e27, 1e28): 2%
[1e28, 3e28): 1.8%
[3e28, 1e29): 1.3%
>=1e29: 1.3%
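For context on how models get placed in buckets like these, a common back-of-the-envelope estimate for dense-transformer training compute is C ≈ 6·N·D, where N is the parameter count and D is the number of training tokens. A minimal sketch, with a bucket classifier matching the ranges above; the parameter and token counts in the usage example are illustrative assumptions, not claims about any specific model:

```python
def estimate_training_flop(params: float, tokens: float) -> float:
    """Back-of-envelope training compute: ~6 FLOP per parameter per token."""
    return 6.0 * params * tokens


def flop_bucket(flop: float) -> str:
    """Classify a FLOP estimate into this market's answer ranges."""
    # Bucket edges alternate between 1eN and 3eN, as in the market.
    edges = [1e24, 3e24, 1e25, 3e25, 1e26, 3e26, 1e27, 3e27, 1e28, 3e28, 1e29]
    if flop < edges[0]:
        return "<1e24"
    for lo, hi in zip(edges, edges[1:]):
        if lo <= flop < hi:
            return f"[{lo:.0e}, {hi:.0e})"
    return ">=1e29"


if __name__ == "__main__":
    # Hypothetical example: a 70B-parameter model trained on 15T tokens.
    c = estimate_training_flop(70e9, 15e12)
    print(f"{c:.2e} FLOP -> bucket {flop_bucket(c)}")
```

With the illustrative numbers above, the estimate lands at roughly 6.3e24 FLOP, i.e. the [3e24, 1e25) bucket.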
Related questions
Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
77% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
27% chance
How much FLOP will be used to train Llama3?
End of pre-training era for language models: Will an LM be fine-tuned for more FLOPs than it is pre-trained for, before 2026?
22% chance
How big will Mistral's known largest language model be? (2024)
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
69% chance
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
How many FLOPs will go into training the first ASL-3 model?
Will a machine learning training run exceed 10^25 FLOP in China before 2025?
49% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2027?
51% chance