GPT-5 context window length >= 64k tokens?
Dec 31 · 90% chance

If multiple versions are released, as with GPT-4, then any of them counts.

GPT-5 refers to the next big model that's a successor to GPT-4

bought Ṁ40 of YES

“Where we’re going we don’t need context windows”

bought Ṁ10 of YES

GPT-3 => GPT-4 was a 16x context length improvement, and context length is super valuable for holding up-to-date data outside the training data (like tool definitions, system prompts, etc.). A 2x increase for the next generation seems like a safe bet.
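The arithmetic behind this comment can be sketched out. This assumes the commonly cited figures: GPT-3's 2,048-token context and GPT-4's larger 32,768-token variant; the "GPT-5" projection is purely the commenter's hypothetical 2x step.

```python
# Assumed public figures: GPT-3 max context 2,048 tokens;
# GPT-4's larger variant 32,768 tokens (the 32k model).
gpt3_ctx = 2_048
gpt4_ctx = 32_768

# Generation-over-generation improvement: 32,768 / 2,048 = 16x.
generation_gain = gpt4_ctx // gpt3_ctx

# Hypothetical conservative 2x step for the next generation.
projected_gpt5_ctx = gpt4_ctx * 2

print(generation_gain)                    # 16
print(projected_gpt5_ctx)                 # 65536
print(projected_gpt5_ctx >= 64 * 1024)    # True — would resolve YES
```

So even a 2x step, far smaller than the last generation's 16x, would clear the market's 64k-token threshold.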

bought Ṁ1 of YES

it will have ∞ context length