Will OpenAI train a 1 trillion parameter machine learning model by the end of 2023?
Resolved N/A on Apr 27
This market resolves to Yes if, by 31 December 2023, OpenAI has made public that it has trained a 1 trillion parameter neural network model. Otherwise it resolves to No.
This question is managed and resolved by Manifold.
On the other hand, the results from Chinchilla (https://arxiv.org/abs/2203.15556) suggest that GPT-3 and similar models were over-parameterized relative to what's compute-optimal (and should instead have seen more data). On the new Chinchilla curves, a 1T parameter model would be two OOMs bigger than the current "best" model and require a lot of FLOPs. If someone's calculated how many FLOPs a compute-optimal 1T param model would use, I'd be very interested to see that, by the way.
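A rough answer to that question can be sketched with the standard rules of thumb: training cost C ≈ 6·N·D FLOPs for N parameters and D tokens, and the Chinchilla heuristic of roughly 20 tokens per parameter for a compute-optimal run. Both constants are approximations, not exact figures from the paper:

```python
# Back-of-envelope estimate of compute-optimal training FLOPs for a
# 1T-parameter model, assuming C ≈ 6*N*D and the Chinchilla rule of
# thumb D ≈ 20*N tokens (both are rough heuristics).

def chinchilla_flops(n_params: float, tokens_per_param: float = 20.0):
    """Return (optimal token count, training FLOPs) for n_params parameters."""
    d_tokens = tokens_per_param * n_params   # Chinchilla-optimal data size
    flops = 6.0 * n_params * d_tokens        # ~6N FLOPs per training token
    return d_tokens, flops

tokens, flops = chinchilla_flops(1e12)
print(f"tokens ~ {tokens:.1e}, FLOPs ~ {flops:.1e}")  # ~2e13 tokens, ~1.2e26 FLOPs
```

Under these assumptions, a compute-optimal 1T-parameter model would need on the order of 2×10^13 tokens and ~1.2×10^26 FLOPs, a couple of orders of magnitude beyond the estimated GPT-3 training budget.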
Related questions
Will OpenAI officially launch any new publicly named AI model before May 1, 2026?
73% chance
Will OpenAI release another open source LLM before end of 2026?
70% chance
Will OpenAI become nothing by 2030?
AI: Will someone train a $1T model by 2030?
25% chance
AI: Will someone train a $1T model by 2050?
81% chance
100GW AI training run before 2031?
33% chance
AI: Will someone train a $1T model by 2080?
69% chance
Will OpenAI disappear before 2034?
48% chance
AI: Will someone train a $1B model by 2028?
82% chance