
If we find out in 2024, was o1's Transformer base trained on 10+x as much compute as GPT-4's?
Ṁ1.1k volume
Resolved N/A on Jul 2
This question is managed and resolved by Manifold.
Related questions
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
How much compute will be used to train GPT-5?
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027?
17% chance
Before 2028, will anyone train a GPT-4-level model in a minute?
29% chance
Will growth in the maximum MW used to train AIs slow down by more than 2x after GPT-5-like models?
55% chance
GPT-4 performance and compute efficiency from a simple architecture before 2026
19% chance
Will growth in the maximum MW used to train AIs slow down by more than 2x after GPT-6-like models?
60% chance
Will a GPT-4 quality model be trained for under $10,000 by 2030?
90% chance
Will OpenAI's autonomous agent be based on GPT-4?
34% chance