Will GPT-4's max context window increase by the end of 2023?
76% chance

Anthropic recently announced Claude-100k, a version of their large language model Claude with a 100k-token context window.

Will OpenAI follow suit by increasing the max context length of GPT-4 before the end of 2023 (Dec 31 2023, 11:59 PM GMT)?

"GPT-4" is defined as: Any product commonly referred to as "gpt-4" or similar by OpenAI. Names like "gpt-4-0314," "gpt-4-multimodal-2," "gpt-4," or "gpt-4-oct," would count as long as they are commonly referred to as GPT-4 and build off the original GPT-4 models. "gpt-4-plus," "gpt-4.5," or "gpt-4.1" would not count unless OpenAI regularly calls them GPT-4 and the consensus is that they are newer versions of gpt-4. It need not be accessible to the general public, but someone must be regularly selling it (so if they sell it to companies only, it would count)

Resolves YES if:
- Any version of GPT-4 has a max context of more than 32,769 text tokens AT ANY POINT within 2023.

Resolves NO if:
- The max context window of all versions of GPT-4 remains at 32,769 text tokens, or decreases.
- Giving GPT-4 more context is discussed in writing, or an experimental version with more context is benchmarked, but no product is released.
- Multimodal GPT-4 has a separate context length for images, but the max text length does not go above 32,769 tokens.

Resolves N/A if:
- A new version of GPT-4 is released, but we don't know what its context length is by the end of 2023.
- It is unclear whether a model with a higher text token maximum than 32,769 is GPT-4 based on the above definition (e.g. GPT-4 gets leaked somehow, someone edits it to increase the context length, Sam Altman calls it GPT-4 informally, but OpenAI doesn't call it GPT-4 officially).

If OpenAI changes its name, the new name applies wherever "OpenAI" appears in the resolution criteria.


Related questions

Will GPT-4 per-token price decrease by the end of Q3'2023?
Will GPT-4 fine-tuning be available by October 1st? (I get down, 1% chance)
Will any group develop a GPT-4 competitor of comparable capability (text) by Oct. 2023 (hyperion, 3% chance)
Will OpenAI release GPT-4 finetuning by Fall 2023? (Mira, 63% chance)
Will GPT 4.5 be announced by October? (2023)
Will OpenAI release ChatGPT 5 before June 2024? (Mahdi, 11% chance)
Will GPT-5 be released before 2025? (Victor Li, 50% chance)
Will a large GPT-4 equivalent competitor model be revealed by the end of 2023?
Will GPT-4's parameter count be known by end of 2024? (Mira, 41% chance)
Will GPT-4's parameter count be announced by the end of 2023? (ada, 21% chance)
Will GPT-4 fine-tuning be available by EOY? (I get down, 66% chance)
Will Microsoft release a speaker with a GPT-4 based home assistant in 2023? (Esben Kran, 14% chance)
GPT4 or better model available for download by EOY 2024?
Will mechanistic interpretability be essentially solved for GPT-2 before 2030? (Matthew Barnett, 30% chance)
Will we train GPT-4 to generate resolution criteria better than the creator 50% of the time by the end of 2023? (Crystal Ballin', 30% chance)
GPT-5 by 2025? (Gigacasting, 59% chance)
When will GPT-5 be released? (2025) (Mira, 41% chance)
Will there be a GPT-4 Instruct model released in 2023? (Mira, 32% chance)
Will GPT-5 be announced in 2024? (Lucas, 67% chance)
ampdot predicts NO

GPT-4 has a max context window of 32,769 tokens, when the prompt + completion are added together. The OpenAI documentation clearly defines the context window as the sum of the token length of the prompt and completion. Per your resolution criteria ("AT ANY POINT IN 2023"), resolve YES.
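(A rough sketch of how that shared budget plays out in practice, using the tiktoken library; the 32,768 figure below is only an assumption for illustration, not a ruling on the disputed exact number.)

    import tiktoken

    ASSUMED_MAX_CONTEXT = 32_768  # assumed gpt-4-32k limit; the exact figure is what's disputed here
    enc = tiktoken.encoding_for_model("gpt-4")

    prompt = "Summarize the following report: ..."
    prompt_tokens = len(enc.encode(prompt))

    # Prompt and completion share one context window, so the completion
    # budget is whatever the prompt leaves over.
    completion_budget = ASSUMED_MAX_CONTEXT - prompt_tokens
    print(f"prompt uses {prompt_tokens} tokens, leaving {completion_budget} for the completion")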

7 replies
Jason

@ampdot Although the market clearly says "increase" -- is that an increase?

osmarks

@Jason > Any version of GPT-4 has a max context of more than 32,768 text tokens AT ANY POINT within 2023.

ShadowyZephyr

@ampdot That's a typo. My bad.
Edit: Actually I think I was right initially.

Genzy predicts YES

I hope zephyr invites you to kick rocks, because this is obviously against the spirit of the market, but props to you 'cause it's a funny point to lawyer on.

Mira predicts YES

@ampdot I've seen that 32,769 figure before somewhere too, but the docs do say 32,768: Models - OpenAI API

I wonder if the last one is a special control token or something.

Also: I don't know about GPT-4 specifically, but transformer models have no intrinsic context window. They might produce nonsense if you push past the lengths they were trained on, or you might not have enough memory on your particular GPU, so people usually limit it in the frontend. The answer might therefore be "infinite," depending on their implementation.
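A minimal sketch of what that frontend-side limit amounts to, assuming a tiktoken tokenizer and a 32,768-token cap (both assumptions; nothing here is documented about OpenAI's actual serving stack):

    import tiktoken

    enc = tiktoken.encoding_for_model("gpt-4")
    ASSUMED_LIMIT = 32_768  # enforced by the serving layer, not by the transformer architecture itself

    def fit_to_context(messages: list[str], limit: int = ASSUMED_LIMIT) -> list[str]:
        """Drop the oldest messages until the conversation fits under the cap."""
        while messages and sum(len(enc.encode(m)) for m in messages) > limit:
            messages = messages[1:]
        return messages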

ShadowyZephyr

@Mira That is true, there is no TECHNICAL limit, and people have experimented with increases before. I'm asking whether a version with a context longer than 32k will be released, i.e. sold by someone.

I'm leaving it as greater than 32,769 for now, because I don't want to deal with this over 1 token, but I'm pretty sure it would count as 32,768 if OpenAI says so.

Gifted Gummy Bee bought Ṁ50 of NO

I do not think so. There simply isn't enough compute for current OAI techniques, and they are focusing on multimodality and reducing inference costs. So I bet no.

2 replies
IC Rainbow predicts YES

@GiftedGummyBee there's enough compute for Anthropic to pull off that 100k model. Surely OpenAI can do the same?

Gifted Gummy Bee predicts NO

@ICRainbow Because Anthropic is both using a different architecture and most likely a way smaller model. From what I heard, sub-100b is what they have. They also used a bunch of tricks, which I doubt OAI didn't already pull for GPT-4-32k / 3.5-turbo-16k.

JRP

I've seen the titles of several papers about increasing context windows, but how difficult is it to implement in practice?
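One published approach, positional interpolation for rotary position embeddings, is only a few lines of math: compress the new, longer position indices back into the range the model saw during training, then fine-tune briefly on long text. Whether GPT-4 uses rotary embeddings at all is not public, so this is a sketch of the published idea with made-up dimensions, not of anything OpenAI does:

    import numpy as np

    def rope_angles(positions: np.ndarray, head_dim: int,
                    base: float = 10000.0, scale: float = 1.0) -> np.ndarray:
        """Rotary-embedding angles; scale < 1 squeezes longer sequences back
        into the position range seen during training (positional interpolation)."""
        inv_freq = 1.0 / (base ** (np.arange(0, head_dim, 2) / head_dim))
        return np.outer(positions * scale, inv_freq)

    # Hypothetical numbers: a model trained on 2,048 positions, stretched to 8,192.
    angles = rope_angles(np.arange(8192), head_dim=128, scale=2048 / 8192)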

JRP

@ShadowyZephyr They retired davinci, I believe, so upgrades are to be expected, I suppose?

JRP bought Ṁ10 of YES

@ShadowyZephyr

Quoting from this: https://platform.openai.com/docs/models/gpt-4 and this https://openai.com/blog/function-calling-and-other-api-updates

"On June 27th, 2023, gpt-4 will be updated to point from gpt-4-0314 to gpt-4-0613, the latest model iteration."

"gpt-4-32k-0613 includes the same improvements as gpt-4-0613, along with an extended context length for better comprehension of larger texts."

So it sounds like there will no longer be a wait list for the 32k context. They took about 3 months to iterate, so I am guessing maybe at least 3 more months for a wait-list/testing model with a context larger than 32k, and another 3 months before public release? Based on the criterion that it "need not be accessible to the general public," I think there is a chance they will be motivated to release a private/testing version for companies before the end of 2023 (given the stiff competition and that they are retiring old models), so I am cautiously betting YES.

SneakySly

I have related markets here: