Will OpenAI train a 1 trillion parameter machine learning model by the end of 2023?
Resolved N/A on Apr 27
This market resolves to Yes if, by the 31st of December 2023, OpenAI has made public the fact that they have trained a 1 trillion parameter neural network model. Otherwise it resolves to No.
On the other hand, the results from Chinchilla (https://arxiv.org/abs/2203.15556) suggest that GPT-3 and similar models were over-parameterized relative to what's compute-optimal (and should instead have been trained on more data). On the new Chinchilla curves, a 1T parameter model would be two OOMs bigger than the current "best" model and would require a lot of FLOPs. If someone has calculated how many FLOPs a compute-optimal 1T param model would use, I'd be very interested to see that, by the way.
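A rough back-of-the-envelope sketch, assuming the commonly cited Chinchilla rules of thumb of roughly 20 training tokens per parameter and total training compute of about 6·N·D FLOPs (these are approximations, not the paper's exact fitted curves):

```python
# Rough estimate of compute-optimal training for a 1T-parameter model,
# assuming ~20 tokens per parameter and C ≈ 6 * N * D FLOPs
# (N = parameter count, D = training tokens). Both constants are
# rule-of-thumb approximations, not exact Chinchilla fits.

def chinchilla_optimal(n_params: float, tokens_per_param: float = 20.0) -> tuple[float, float]:
    """Return (training tokens, training FLOPs) for a compute-optimal run."""
    tokens = tokens_per_param * n_params
    flops = 6.0 * n_params * tokens
    return tokens, flops

if __name__ == "__main__":
    n = 1e12  # 1 trillion parameters
    tokens, flops = chinchilla_optimal(n)
    print(f"tokens: {tokens:.1e}")  # ~2.0e13 (about 20T tokens)
    print(f"FLOPs:  {flops:.1e}")   # ~1.2e26 FLOPs
```

Under those assumptions, a compute-optimal 1T-parameter run would need on the order of 2e13 tokens and roughly 1.2e26 FLOPs, i.e. a few hundred times GPT-3's reported ~3e23 training FLOPs.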
bought Ṁ20 of YES
For context: GPT-3 was 175B parameters, released in 2020; GPT-2 was only 1.5B, just a year earlier. Extrapolating naively to GPT-4 would get to something like 10T parameters (and it's arguably overdue), but it's possible the focus has shifted toward training smaller models à la RETRO?
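A quick sketch of that naive extrapolation, assuming the GPT-2 → GPT-3 growth factor simply repeats for a hypothetical next model (the parameter counts are the publicly reported ones):

```python
# Naive parameter-count extrapolation: apply the GPT-2 -> GPT-3
# growth factor once more to guess a hypothetical GPT-4 size.
gpt2_params = 1.5e9   # GPT-2 (2019)
gpt3_params = 175e9   # GPT-3 (2020)

growth = gpt3_params / gpt2_params   # ~117x per generation
gpt4_naive = gpt3_params * growth    # ~2e13, i.e. on the order of 10T parameters

print(f"growth factor: {growth:.0f}x")
print(f"naive next-model estimate: {gpt4_naive:.1e} parameters")
```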