Will an AI model use more than 1e28 FLOPS in training before 2026?
10% chance
Resolution source: Epoch AI's list of notable AI models (https://epoch.ai/data/notable-ai-models). I will check this source on January 1st, 2026, to see whether it lists a model trained with more than 1e28 FLOP.
AI models are not limited to LLMs; any other type of AI model listed in the resolution source also counts.
As of market creation, the biggest LLM is Grok 3, at 4.6e26 FLOP of training compute.
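The resolution check above amounts to filtering Epoch's dataset for any model whose training compute exceeds the 1e28 FLOP threshold. A minimal sketch of that check, assuming a local CSV snapshot of the dataset with hypothetical column names ("Model", "Training compute (FLOP)" are assumptions; the real export may differ):

```python
# Sketch of the resolution check against an assumed local snapshot of
# Epoch AI's notable-models dataset. Column names are assumptions.
import csv
import io

def models_over_threshold(csv_text: str, threshold: float = 1e28) -> list[str]:
    """Return names of models whose training compute exceeds `threshold` FLOP."""
    reader = csv.DictReader(io.StringIO(csv_text))
    hits = []
    for row in reader:
        try:
            compute = float(row["Training compute (FLOP)"])
        except (KeyError, ValueError):
            continue  # skip rows with missing or non-numeric compute figures
        if compute > threshold:
            hits.append(row["Model"])
    return hits

# Illustrative data only -- the Grok 3 figure is the 4.6e26 FLOP quoted
# in the market description; the second row is hypothetical.
sample = """Model,Training compute (FLOP)
Grok 3,4.6e26
Hypothetical-1,1.2e28
"""
print(models_over_threshold(sample))  # ['Hypothetical-1']
```

The market would resolve YES exactly when this list is non-empty on January 1st, 2026.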
This question is managed and resolved by Manifold.
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
76% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2026?
52% chance
Will a machine learning training run exceed 10^27 FLOP in China before 2028?
44% chance
Will there be an announcement of a model with a training compute of over 1e30 FLOPs by the end of 2025?
5% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2028?
85% chance
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Will a machine learning training run exceed 10^26 FLOP in China before 2029?
82% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2027?
86% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2027?
82% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2025?
77% chance