
How many parameters will GPT-4 have?
3.3k traders · Ṁ33k volume · resolved May 12
| Answer | Probability |
|---|---|
| >1600 | 100% (resolved) |
| Other | 0.0% |
| 300 | 0.1% |
| 350-400 | 0.0% |
| 401-450 | 0.0% |
| 451-500 | 0.0% |
| 501-550 | 0.0% |
| 551-600 | 0.0% |
| 601-700 | 0.0% |
| 701-800 | 0.0% |
| 801-1000 | 0.2% |
| 1001-1200 | 0.2% |
| 1201-1400 | 0.6% |
| 1401-1600 | 3% |
| 801-1600 | 0.4% |
| 301-349 or <300 | 0.1% |
| 1200B exactly | 1.0% |
| 1000B-1400B | 0.1% |
GPT-3 has a staggering 175 billion parameters. To put that into context: Hugging Face's 176-billion-parameter model took 3.5 months on 384 top-of-the-line GPUs to train. GPT-3 is also over two years old.
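As a rough back-of-the-envelope sketch of what that training run costs (assuming 30-day months and an uninterrupted run; real runs also lose time to restarts and downtime), it works out to nearly a million GPU-hours:

```python
# Rough GPU-hours estimate for a 176B-parameter training run,
# from the figures quoted above (384 GPUs for 3.5 months).
gpus = 384
months = 3.5
wall_clock_hours = months * 30 * 24   # ~2,520 hours, assuming 30-day months
gpu_hours = gpus * wall_clock_hours   # ~967,680 GPU-hours

print(f"{gpu_hours:,.0f} GPU-hours")  # 967,680 GPU-hours
```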
Nov 17, 1:56am: Title edited from "How many parameters with GPT-4 have?" to "How many parameters will GPT-4 have?"
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ2,055 |
| 2 | | Ṁ1,939 |
| 3 | | Ṁ780 |
| 4 | | Ṁ768 |
| 5 | | Ṁ632 |
People are also trading:

- Will GPT-5 have fewer parameters than GPT-4? (1500M subsidy) · 21% chance
- What will be true about GPT-5?
- What will GPT-5's context size be? (2025)
- GPT-4 #5: Will GPT-4 be a dense model? · 1% chance
- By 2028, will there be a language model of less than 10B parameters that is superior to GPT-4? · 84% chance
- In yottaFLOPs (10^24), how much compute will GPT-4 be trained with? · 22
- Will GPT-4 be trained on more than 10T text tokens? · 36% chance
- Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024? · 13% chance
- Will GPT-4 escape? · 6% chance
- What will be true about GPT-5? (See description)
@Lorenzo Fair question! Since the market "Will GPT-4 have over 1 trillion parameters?" resolved YES, >1600 is the only suitable option here.
@MayMeta GPT-4 recommends weighting by the reciprocal of the interval length, i.e. the 800-wide entry should get a weight of 1/800 if the answer falls inside it, while a 100-wide entry would get a weight of 1/100, so 8x more.
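A minimal sketch of that weighting scheme in Python, using a few of this market's ranges as examples. The `weights_for` helper and the final normalization step are illustrative additions, not anything Manifold actually runs:

```python
# Reciprocal-interval-length weighting: narrower ranges that contain
# the true value earn proportionally more credit than wide ones.
intervals = {
    "801-1600": (801, 1600),    # 800-wide -> raw weight 1/800
    "1401-1600": (1401, 1600),  # 200-wide -> raw weight 1/200
    "1201-1400": (1201, 1400),  # does not contain 1500, gets nothing
}

def weights_for(answer: float, intervals: dict) -> dict:
    """Weight each interval containing `answer` by 1/width (inclusive
    integer range), then normalize so the weights sum to 1."""
    raw = {
        name: 1 / (hi - lo + 1)
        for name, (lo, hi) in intervals.items()
        if lo <= answer <= hi
    }
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

print(weights_for(1500, intervals))
# {'801-1600': 0.2, '1401-1600': 0.8}
# The 200-wide range gets 4x the weight of the 800-wide one.
```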