Will there be a version of GPT-4 with a context window of 1 million tokens this year?
resolved Jan 1
Resolved
NO
According to the claim in this reddit thread: https://www.reddit.com/r/ChatGPT/comments/13xbd9n/1_million_tokens_context_window_is_coming_this/
"Longer context windows — Context windows as high as 1 million tokens are plausible in the near future."
How plausible is this?
This market resolves YES if a version of GPT-4 with a context window of 1 million or more tokens is released this year.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ128
2 | | Ṁ73
3 | | Ṁ68
4 | | Ṁ48
5 | | Ṁ31
magic.dev announces a 5 million token context window: https://magic.dev/blog/ltm-1
@EvanConrad3b7e Is the model actually any good, though? This doesn't mean much if it's a small model, because that is much easier to run. OpenAI will have a harder time running GPT-4 at 1M.
Related questions
GPT-5 context window length >= 64k tokens?
90% chance
Will GPT-4 be trained on more than 10T text tokens?
35% chance
GPT-5 context window length >= 128k tokens?
91% chance
How many parameters will GPT-4 have?
Will anyone be reported to have made ≥1 million USD in 2024 through usage of their custom GPT?
35% chance
Will ~all Claude premium users have access to 1m token context window by June 1st?
20% chance
[Metaculus] Will an LLM at least on the scale of GPT-4 be widely available for download before January 1st, 2025?
89% chance
Will GPT-4's parameter count be known by end of 2024?
59% chance
Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
77% chance
Will any LLM have a context window of at least 1 million characters by the end of 2028?
93% chance