How much compute will be used to train GPT-5?
17 traders · Ṁ1.5k volume
- Fewer than 5e25 FLOP: 1.1%
- Between 5e25 FLOP and 1.5e26 FLOP: 10%
- Between 1.5e26 FLOP and 4.5e26 FLOP: 28%
- Between 4.5e26 FLOP and 1.4e27 FLOP: 31%
- Between 1.4e27 FLOP and 4e27 FLOP: 22%
- More than 4e27 FLOP: 7%
This resolves as the bucket that contains the number of floating point operations (FLOP) used to train GPT-5.
FLOP may be performed at any precision (INT8, FP16, etc.).
This resolves on the basis of the numbers reported by OpenAI; if OpenAI does not report them, it resolves on the basis of the first credible estimate reported in this database (details about the database may be found here).
If GPT-5 is not released by EOY 2027, this resolves ambiguously.
If the estimate falls exactly on the edge between two ranges, it resolves as the larger of the two options.
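The bucket logic above can be sketched in Python. This is a hypothetical helper for illustration, not Manifold's actual resolution code; the bucket edges and the tie-breaking rule (an estimate exactly on an edge resolves to the larger bucket) are taken from the criteria above.

```python
# Bucket edges and labels from the market's answer options.
EDGES = [5e25, 1.5e26, 4.5e26, 1.4e27, 4e27]
BUCKETS = [
    "Fewer than 5e25 FLOP",
    "Between 5e25 FLOP and 1.5e26 FLOP",
    "Between 1.5e26 FLOP and 4.5e26 FLOP",
    "Between 4.5e26 FLOP and 1.4e27 FLOP",
    "Between 1.4e27 FLOP and 4e27 FLOP",
    "More than 4e27 FLOP",
]

def resolve_bucket(flop: float) -> str:
    """Return the bucket a training-compute estimate resolves to.

    An estimate exactly equal to an edge counts toward the larger
    bucket, so the comparison is strict (<), not <=.
    """
    for edge, bucket in zip(EDGES, BUCKETS):
        if flop < edge:
            return bucket
    return BUCKETS[-1]

print(resolve_bucket(2e26))   # Between 1.5e26 FLOP and 4.5e26 FLOP
print(resolve_bucket(5e25))   # edge case: resolves to the larger bucket
```

For example, an estimate of exactly 5e25 FLOP sits on the first edge, so the strict `<` comparison skips the "Fewer than 5e25 FLOP" bucket and resolves to "Between 5e25 FLOP and 1.5e26 FLOP", matching the stated tie-breaking rule.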
This question is managed and resolved by Manifold.
Related questions
- In yottaFLOPs (10^24), how much compute will GPT-4 be trained with? (forecast: 22)
- GPT-5 trained with >=24k GPUs? (82% chance)
- What hardware will GPT-5 be trained on?
- What percentage of 2025 will be left when OpenAI announces GPT-5? (38% chance)
- What percentage of 2025 will be left when OpenAI releases GPT-5? (38% chance)
- Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026? (84% chance)
- What will be true about GPT-5?
- Will growth in the maximum MW used to train AIs slow down by more than x2 after GPT-5-like? (55% chance)
- Will the ratio of inference runs to training runs on GPT-5 decrease from the ratio on GPT-4? (50% chance)
- What will GPT-5's context size be? (2025)