
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22 expected
Resolves N/A if the number is not public or calculable from public information, or if GPT-4 is not released by market close.
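For markets like this, the figure is often estimated rather than disclosed directly. A minimal sketch of such a calculation, using the widely cited C ≈ 6·N·D approximation for dense transformer training compute (N = parameters, D = training tokens), and converting to yottaFLOPs (10^24 FLOPs). The parameter and token counts below are hypothetical placeholders, not GPT-4 figures:

```python
def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute via C = 6 * N * D (FLOPs)."""
    return 6 * params * tokens

def to_yottaflops(flops: float) -> float:
    """Convert raw FLOPs to yottaFLOPs (1 yottaFLOP = 1e24 FLOPs)."""
    return flops / 1e24

# Hypothetical example: a 100B-parameter model trained on 1T tokens.
c = training_flops(100e9, 1e12)      # 6e23 FLOPs
print(to_yottaflops(c))              # about 0.6 yottaFLOPs
```

The 6·N·D rule of thumb counts roughly 2 FLOPs per parameter for the forward pass and 4 for the backward pass per token; actual figures depend on architecture and hardware utilization.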
This question is managed and resolved by Manifold.
Related questions
How much compute will be used to train GPT-5?
GPT-5 trained with >=24k GPUs? (82% chance)
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026? (83% chance)
What hardware will GPT-5 be trained on?
Was GPT-4 trained in 4 months or less? (59% chance)
GPT-4 performance and compute efficiency from a simple architecture before 2026 (19% chance)
How many parameters does GPT4o have?
Will there be an announcement of a model with a training compute of over 1e30 FLOPs by the end of 2025? (5% chance)
Will an AI model use more than 1e28 FLOPS in training before 2026? (8% chance)
Will a GPT-4 level system be trained for <$1mm by 2028? (89% chance)