What will GPT-5's context size be? (2025)
Current market probabilities:

Option   Probability
0        0%
2k       0.1%
4k       2%
8k       0%
16k      0%
32k      0.2%
64k      1.3%
128k     12%
256k     20%
512k     11%
1024k    34%
2048k    8%
4096k    11%

When OpenAI announces GPT-5, this market resolves to the largest context size, measured in tokens, that they announce support for.

GPT-3: 2048 tokens

GPT-3.5: 4096

GPT-4: 8k, 32k

GPT-5: ???

Anthropic has announced a 100k-token variant of Claude, there are rumors of upcoming models with 1-million-token context sizes, and OpenAI would surely want the most impressive-sounding model at release.

In the unexpected case that they don't mention a specific context size, or the architecture changes so that fixed context sizes no longer make sense, I'll wait until I have access and test the model's recall on very large documents.
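For that recall test, something like the following needle-in-a-haystack check is what I have in mind. This is only a sketch: `ask` stands in for whatever chat API is available at release, and the token count is a crude word-based estimate.

```python
import random

def build_haystack(n_tokens: int, needle: str, depth: float) -> str:
    """Roughly n_tokens of filler with the needle inserted at depth (0..1)."""
    filler = "The quick brown fox jumps over the lazy dog. "
    sentences = [filler] * max(1, n_tokens // 9)  # ~9 tokens per sentence, crude
    sentences.insert(int(len(sentences) * depth), needle + " ")
    return "".join(sentences)

def recall_ok(ask, n_tokens: int, depth: float) -> bool:
    """Does the model retrieve a fact buried at `depth` in ~n_tokens of text?"""
    secret = str(random.randint(10_000, 99_999))
    doc = build_haystack(n_tokens, f"The magic number is {secret}.", depth)
    answer = ask(doc + "\n\nWhat is the magic number? Reply with digits only.")
    return secret in answer

if __name__ == "__main__":
    echo = lambda prompt: prompt  # stub that trivially "recalls" everything
    for depth in (0.0, 0.5, 1.0):
        print(depth, recall_ok(echo, n_tokens=100_000, depth=depth))
```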

If the largest announced context size isn't in the table above, this market resolves to a weighting of the two surrounding entries, interpolating on a log2 scale (k is a multiplier of 1024). Take log2 of the announced size: the fractional part becomes the weight on the entry above, and the remainder goes to the entry below. GPT-4 would resolve "32k" outright. Claude, taking 100k as 100,000 tokens, gives log2(100,000) ≈ 16.61, so 2^17 = 128k would get weight 61% and 2^16 = 64k would get weight 39%.
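Spelled out as code, the weighting rule reads like this (a sketch under the assumptions above; bucket labels follow the 2k to 4096k table, with k = 1024):

```python
import math

def split_weights(context_tokens: int) -> dict[str, float]:
    """Split resolution weight between the two buckets around context_tokens."""
    exp = math.log2(context_tokens)
    lo, frac = int(exp), exp - int(exp)
    if frac == 0:  # exact power of two: all weight on that bucket
        return {f"{2 ** (lo - 10)}k": 1.0}
    return {
        f"{2 ** (lo - 10)}k": round(1 - frac, 2),  # entry below
        f"{2 ** (lo + 1 - 10)}k": round(frac, 2),  # entry above
    }

print(split_weights(32 * 1024))  # GPT-4 -> {'32k': 1.0}
print(split_weights(100_000))    # Claude -> {'64k': 0.39, '128k': 0.61}
```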
