
Will an LLM have a context window of 20M or more in 2024?
Resolved NO on Jan 2 · Ṁ4,958
Resolves YES if, on or before Dec 31, 2024, a widely available LLM ranked in the LMSYS Chatbot Arena Top 15 overall offers a context window of 20 million tokens or more. The context window may be available via API, open source, web, app, or any other interface.
This question is managed and resolved by Manifold.
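For a sense of scale, the sketch below estimates how much plain text a 20-million-token window could hold. It uses the open-source tiktoken tokenizer with the cl100k_base encoding purely as an illustration; neither that tokenizer nor the sample text is part of the resolution criteria, and different models tokenize somewhat differently.

```python
# Rough illustration of the 20M-token scale.
# Assumption: tiktoken's cl100k_base encoding stands in for whatever
# tokenizer a qualifying model would use; ratios vary by model and text.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

# Estimate tokens-per-character on a simple English sample.
sample = "The quick brown fox jumps over the lazy dog. " * 1000
tokens_per_char = len(enc.encode(sample)) / len(sample)

window = 20_000_000  # the 20M-token threshold from the question
approx_chars = window / tokens_per_char

print(f"~{tokens_per_char:.3f} tokens per character in this sample")
print(f"A 20M-token window holds roughly {approx_chars / 1e6:.0f}M characters "
      f"(~{approx_chars / 5 / 1e6:.0f}M English words) of similar text.")
```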
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ1,235 |
| 2 | | Ṁ137 |
| 3 | | Ṁ94 |
| 4 | | Ṁ92 |
| 5 | | Ṁ48 |
Related questions
- 6 months from now will I judge that LLMs had already peaked by Nov 2024? (16% chance)
- Will an LLM consistently create 5x5 word squares by 2026? (83% chance)
- Will the best LLM in 2025 have <500 billion parameters? (24% chance)
- Will the best LLM in 2025 have <1 trillion parameters? (42% chance)
- Will an LLM do a task that the user hadn't requested in a notable way before 2026? (95% chance)
- Will the best LLM in 2026 have <1 trillion parameters? (40% chance)
- Will LLMs become a ubiquitous part of everyday life by June 2026? (89% chance)
- Will the best LLM in 2027 have <1 trillion parameters? (26% chance)
- Will RL work for LLMs "spill over" to the rest of RL by 2026? (35% chance)
- Will a frontier-level diffusion LLM exist by 2028? (31% chance)