Will OpenAI train a 1 trillion parameter machine learning model by the end of 2023?
Resolved N/A on Apr 27
This market resolves to Yes if, by 31st of December 2023, OpenAI has made public the fact that they have trained a 1 trillion parameter neural network model. Otherwise it resolves to No.
This question is managed and resolved by Manifold.
On the other hand, the results from Chinchilla (https://arxiv.org/abs/2203.15556) suggest that GPT-3 and similar models were over-parameterized relative to what's compute-optimal (and should instead have been trained on more data). On the new Chinchilla curves, a 1T parameter model would be two OOMs bigger than the current "best" model and would require a lot of FLOPs. If someone's calculated how many FLOPs a compute-optimal 1T param model would use, I'd be very interested to see that, by the way.
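For the FLOPs question above, a rough back-of-the-envelope sketch is possible using two widely quoted approximations from the Chinchilla results: compute-optimal training uses roughly 20 tokens per parameter, and training cost is roughly 6 FLOPs per parameter per token. Both ratios are approximations, not exact figures:

```python
# Back-of-the-envelope Chinchilla-style estimate.
# Assumptions (approximate, from the Chinchilla paper's findings):
#   - compute-optimal training uses ~20 tokens per parameter
#   - training cost is ~6 * N * D FLOPs for N params and D tokens
N = 1e12                  # parameters: 1 trillion
tokens_per_param = 20     # approximate compute-optimal ratio
D = tokens_per_param * N  # training tokens: ~2e13 (20 trillion)
flops = 6 * N * D         # ~1.2e26 training FLOPs

print(f"tokens: {D:.1e}, training FLOPs: {flops:.1e}")
```

Under these assumptions, a compute-optimal 1T parameter model would need on the order of 10^26 training FLOPs, roughly two orders of magnitude more compute than GPT-3's training run.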
Related questions
An AI model with 100 trillion parameters exists by the end of 2025? (20% chance)
Will OpenAI models achieve ≥90% on SimpleBench by the end of 2025? (24% chance)
Will OpenAI launch a model even more expensive than o1-pro in 2025? (28% chance)
Will OpenAI fold in 2025? (2% chance)
Will a single model achieve superhuman performance on all OpenAI gym environments by 2025? (25% chance)
Will OpenAI become nothing by 2030?
AI: Will someone train a $1T model by 2030? (25% chance)
AI: Will someone train a $1T model by 2050? (81% chance)
100GW AI training run before 2031? (36% chance)
AI: Will someone train a $1T model by 2080? (69% chance)