
Will a Mamba 7b model trained on 2 trillion tokens outperform Llama2-13B
Ṁ1k · Ṁ738 · resolved Aug 23
Resolved NO
The question will resolve positive if someone trains a Mamba (https://twitter.com/tri_dao/status/1731728602230890895) language model with <=7.5 billion parameters on <=2 trillion tokens that outperforms Llama2-13B on the Hugging Face Open LLM Leaderboard (https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
This question is managed and resolved by Manifold.
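For illustration only, here is a minimal sketch of the resolution criteria expressed as a check, assuming hypothetical parameter counts, training-token counts, and leaderboard average scores; the real numbers would have to come from the candidate model's report and its Open LLM Leaderboard entry.

```python
# Hypothetical sketch of this market's resolution criteria.
# All numeric example values below are placeholders, not real leaderboard results.

def qualifies(params_billions: float,
              train_tokens_trillions: float,
              model_avg_score: float,
              llama2_13b_avg_score: float) -> bool:
    """Return True if a Mamba model would satisfy the market's criteria."""
    return (
        params_billions <= 7.5              # at most 7.5B parameters
        and train_tokens_trillions <= 2.0   # trained on at most 2T tokens
        and model_avg_score > llama2_13b_avg_score  # beats Llama2-13B's leaderboard average
    )

# Example with made-up scores:
print(qualifies(7.0, 2.0, model_avg_score=56.1, llama2_13b_avg_score=55.7))  # True
```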
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ97 |
| 2 | | Ṁ95 |
| 3 | | Ṁ19 |
| 4 | | Ṁ14 |
| 5 | | Ṁ2 |