Will an AI model use more than 1e28 FLOPS in training before 2026?
Ṁ1k · Ṁ6.5k · resolved Jan 1
Resolved NO
Resolution source: Epoch AI's list of notable AI models (https://epoch.ai/data/notable-ai-models). I will check this source on January 1st, 2026, to see whether any listed model used more than 1e28 FLOP of training compute.
AI models are not limited to LLMs; any type of model included in the resolution source counts.
As of market creation, the largest model by training compute is Grok 3, at 4.6e26 FLOP, roughly a factor of 22 below the 1e28 threshold.
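For concreteness, here is a minimal sketch of how the resolution check could be automated against Epoch AI's dataset. The CSV download path and the column names (`Training compute (FLOP)`, `Model`) are assumptions, not confirmed parts of the Epoch schema; check the page above for the current download link before relying on this.

```python
# Hypothetical sketch: filter Epoch AI's notable-models dataset for
# training runs above the 1e28 FLOP threshold. The CSV URL and column
# names are assumed, not verified against the live schema.
import pandas as pd

CSV_URL = "https://epoch.ai/data/notable_ai_models.csv"  # assumed path
COMPUTE_COL = "Training compute (FLOP)"                  # assumed header

df = pd.read_csv(CSV_URL)
# Coerce to numeric in case of blanks or formatted strings.
compute = pd.to_numeric(df[COMPUTE_COL], errors="coerce")
over_threshold = df[compute > 1e28]

# The market resolves YES if this set is non-empty on January 1st, 2026.
if over_threshold.empty:
    print("No model above 1e28 FLOP; resolves NO")
else:
    print(over_threshold[["Model", COMPUTE_COL]])  # "Model" also assumed
```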
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ391 |
| 2 | | Ṁ110 |
| 3 | | Ṁ91 |
| 4 | | Ṁ75 |
| 5 | | Ṁ33 |
Related questions
Will a machine learning training run exceed 10^27 FLOP in China before 2028?
44% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2028?
85% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2029?
82% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2027?
86% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2027?
82% chance
Will a machine learning training run exceed 10^27 FLOP in China before 2030?
79% chance
Will a machine learning training run exceed 10^30 FLOP in China before 2035?
34% chance
Most training run compute greater than 2e27 FLOP by EOY 2026?
92% chance
Will models be able to do the work of an AI researcher/engineer before 2027?
25% chance
At least one of the most powerful neural nets at end of 2030 will be trained using 10^27 FLOPs
93% chance