Will anyone train a 50B parameter+ RetNet by the end of 2023?
Ṁ1477 · Resolved NO (Jan 3)
The [RetNet paper](https://arxiv.org/abs/2307.08621) claims some pretty cool results in the small-model range (up to 6.7B parameters). Will anyone attempt to scale that up to a large model?
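For readers who haven't seen the paper: RetNet's pitch is that it replaces softmax attention with a "retention" mechanism that has an equivalent recurrent form, so autoregressive decoding costs O(1) per token instead of growing with context length. Here is a minimal, illustrative NumPy sketch of that recurrent update; the function name, shapes, single-head simplification, and default decay are ours for illustration, not the authors' code.

```python
import numpy as np

def recurrent_retention(Q, K, V, gamma=0.96875):
    """Illustrative sketch of single-head recurrent retention (Sun et al., 2023).

    Q, K, V: (seq_len, d) arrays of projected inputs.
    gamma: per-head decay in (0, 1); the paper derives it per head
    (0.96875 here is just an assumed example value).
    The state S is a single (d, d) matrix updated in O(1) per token,
    which is what gives RetNet constant-memory decoding.
    """
    seq_len, d = Q.shape
    S = np.zeros((d, d))
    out = np.empty((seq_len, d))
    for n in range(seq_len):
        S = gamma * S + np.outer(K[n], V[n])  # S_n = gamma * S_{n-1} + K_n^T V_n
        out[n] = Q[n] @ S                     # o_n = Q_n S_n
    return out

# Tiny smoke test with random projections (checks shapes, not model quality)
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 4)) for _ in range(3))
print(recurrent_retention(Q, K, V).shape)  # (8, 4)
```

The question, then, is whether this constant-cost inference story survives the jump from 6.7B to 50B+ parameters, where training stability and quality-versus-attention tradeoffs are untested.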
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ100 |
| 2 | | Ṁ43 |
| 3 | | Ṁ37 |
| 4 | | Ṁ14 |
| 5 | | Ṁ9 |
Related questions
- An AI model with 100 trillion parameters exists by the end of 2025? (22% chance)
- Will a lab train a >=1e26 FLOP state space model before the end of 2025? (15% chance)
- Will anyone train a TokenFormer model at scale before 2026? (25% chance)
- Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026? (32% chance)
- What will be the parameter count (in trillions) of the largest neural network by the end of 2030?
- Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc) (60% chance)
- 1GW AI training run before 2027? (61% chance)
- 100GW AI training run before 2031? (37% chance)
- AI: Will someone train a $1B model by 2028? (81% chance)
- AI: Will someone train a $1T model by 2030? (25% chance)