Resolved YES (Nov 7)
According to the claim in this Reddit thread: https://www.reddit.com/r/ChatGPT/comments/13xbd9n/1_million_tokens_context_window_is_coming_this/
"Longer context windows — Context windows as high as 1 million tokens are plausible in the near future."
How plausible is this?
This market resolves YES if a version of GPT-4 with a context window of 100k tokens or greater is released this year.
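For a sense of what the 100k-token threshold means in practice, here is a minimal sketch (an illustration, not part of the market's resolution criteria) that counts tokens with OpenAI's tiktoken library, assuming the cl100k_base encoding used by GPT-4:

```python
# Minimal sketch: check whether a piece of text fits within a given
# context-window limit, using tiktoken's cl100k_base encoding (GPT-4's).
import tiktoken

def fits_in_context(text: str, limit: int = 100_000) -> bool:
    """Return True if `text` encodes to no more than `limit` tokens."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text)) <= limit

# Example: a short string is trivially far below a 100k-token window.
print(fits_in_context("hello world " * 10))  # True
```

Roughly, 100k tokens corresponds to on the order of 75k English words, so a window this size can hold a short novel in a single prompt.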
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ1,295 |
| 2 | | Ṁ126 |
| 3 | | Ṁ115 |
| 4 | | Ṁ103 |
| 5 | | Ṁ102 |
Related questions
- Will GPT-4's parameter count be known by end of 2024? (59% chance)
- How many parameters will GPT-4 have? (1.7T)
- Will GPT-4 be trained on more than 10T text tokens? (35% chance)
- GPT-5 context window length >= 128k tokens? (91% chance)
- GPT-5 context window length >= 64k tokens? (90% chance)
- Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028? (77% chance)
- Will GPT-5 have more than 10 trillion parameters? (29% chance)
- Will ~all Claude premium users have access to a 1M token context window by June 1st? (20% chance)
- By 2028, will there be a language model of less than 10B parameters that is superior to GPT-4? (81% chance)