Will an LLM with less than 10B parameters beat GPT4 by EOY 2025?
Resolved YES on Jan 10 · Ṁ1812 volume
How much juice is left in 10B parameters?

Target to beat: the original GPT-4-0314 (Elo 1188), as judged by the LMSYS Chatbot Arena leaderboard. Current sub-10B SoTA: Llama 3 8B Instruct (Elo 1147).
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ47 |
| 2 | | Ṁ44 |
| 3 | | Ṁ32 |
| 4 | | Ṁ26 |
| 5 | | Ṁ23 |
Related questions
- Will the best LLM in 2025 have <1 trillion parameters? (42% chance)
- Will the best LLM in 2025 have <500 billion parameters? (24% chance)
- Will the best LLM in 2026 have <1 trillion parameters? (40% chance)
- Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026? (83% chance)
- Will the best LLM in 2027 have <1 trillion parameters? (26% chance)
- Will an open-source LLM under 10B parameters surpass Claude 3.5 Haiku by EOY 2025? (99% chance)
- Size of smallest open-source LLM matching GPT-3.5's performance in 2025, in GB? (1.83)
- Will the best LLM in 2026 have <500 billion parameters? (27% chance)
- China will make an LLM approximately as good as or better than GPT-4 before 2025 (89% chance)
- Will the best LLM in 2027 have <500 billion parameters? (13% chance)