Will an AI model use more than 1e28 FLOPS in training before 2026?
Resolved NO (Jan 1)

Resolution source: Epoch AI's list of notable AI models (https://epoch.ai/data/notable-ai-models). I will check this source on January 1st, 2026, to see whether any listed model used more than 1e28 FLOPS of training compute.

AI models are not limited to LLMs; other types of AI models included in the resolution source also count.

As of market creation, the largest LLM by training compute is Grok 3, at 4.6e26 FLOPs.
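A minimal sketch of how the resolution check could be automated against the Epoch AI dataset. The CSV download URL and the "Training compute (FLOP)" column name are assumptions for illustration, not details confirmed by the market description.

```python
# Hedged sketch: check Epoch AI's notable-models list for any model above 1e28 FLOP.
# ASSUMPTIONS (not from the market page): the dataset is exportable as CSV at
# CSV_URL below, and training compute lives in a column named "Training compute (FLOP)".
import pandas as pd

CSV_URL = "https://epoch.ai/data/notable_ai_models.csv"  # hypothetical export URL
THRESHOLD_FLOP = 1e28

def resolve_market(csv_url: str = CSV_URL) -> str:
    df = pd.read_csv(csv_url)
    compute = pd.to_numeric(df["Training compute (FLOP)"], errors="coerce")
    over_threshold = df[compute > THRESHOLD_FLOP]
    # The market resolves YES if any listed model exceeds 1e28 FLOP of training compute.
    return "YES" if not over_threshold.empty else "NO"

if __name__ == "__main__":
    print(resolve_market())
```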


🏅 Top traders (total profit)
#1: Ṁ391
#2: Ṁ110
#3: Ṁ91
#4: Ṁ75
#5: Ṁ33

I don't see how this is possible. Stargate won't be fully underway until mid-2026, and if they manage a partial training run with, say, 200k GB200s that runs twice as long as Grok 3's, that's looking at ~3.68e27 FLOPs. I think it's a similar story for Project Rainier and even for Google.
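A back-of-the-envelope check of that figure, using Grok 3's 4.6e26 FLOPs from the market description. The implied ~4x effective throughput of a 200k-GB200 cluster relative to Grok 3's training cluster is treated as the commenter's assumption, not an independently sourced number.

```python
# Back-of-the-envelope reproduction of the ~3.68e27 FLOP estimate in the comment above.
# ASSUMPTION: the 4x cluster-throughput multiplier (200k GB200s vs. Grok 3's cluster)
# is implied by the comment's numbers, not a measured figure.
GROK3_TRAINING_FLOP = 4.6e26   # from the market description
THROUGHPUT_MULTIPLIER = 4.0    # 200k GB200s vs. Grok 3's cluster (assumed)
DURATION_MULTIPLIER = 2.0      # "runs twice as long as Grok 3's"

estimate = GROK3_TRAINING_FLOP * THROUGHPUT_MULTIPLIER * DURATION_MULTIPLIER
print(f"Estimated training compute: {estimate:.2e} FLOP")            # -> 3.68e+27
print(f"Short of 1e28 by a factor of {1e28 / estimate:.1f}x")        # -> ~2.7x
```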

@MingCat I can't bet because I have edit access to this database, but it's an easy NO.

bought Ṁ150 NO

@JoshYou whoa, cool to know, thanks!
