
If we find out in 2024: was o1's Transformer base trained with 10× or more as much compute as GPT-4's?
Ṁ1080 · Jan 2
19% chance
This question is managed and resolved by Manifold.
Related questions
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
82% chance
Will there be evidence in 2025 that in April 2023, OpenAI had a GPT-4.5 or higher model?
16% chance
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
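The "22" above is the market's numeric answer in yottaFLOPs. As a sanity check, a minimal conversion sketch, assuming a training-compute estimate of 2.2×10^25 FLOP (an illustrative figure consistent with the market's answer, not an official OpenAI number):

```python
# Convert a raw FLOP estimate to yottaFLOPs (1 yottaFLOP = 10^24 FLOP).
YOTTA = 1e24  # SI "yotta" prefix

def to_yottaflops(flops: float) -> float:
    """Express a FLOP count in units of 10^24 FLOP."""
    return flops / YOTTA

# Assumed estimate for GPT-4 training compute: 2.2e25 FLOP.
print(to_yottaflops(2.2e25))  # → 22.0
```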
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026?
84% chance
Will we have an open-source model that is equivalent to GPT-4 by the end of 2025?
82% chance
How much compute will be used to train GPT-5?
Will an open source model beat GPT-4 in 2024?
76% chance
Will a language model comparable to GPT-4 be trained with ~1/10th the energy it took to train GPT-4, by 2028?
92% chance
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027?
16% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance