How many parameters will GPT-4 have?
58
634
Ṁ33K2025
1D
1W
1M
ALL
0%
300
0.1%
350-400
0%
401-450
0%
451-500
0%
501-550
0%
551-600
0%
601-700
0%
701-800
0%
801-1000
0.2%
1001-1200
0.2%
1201-1400
0.6%
1401-1600
94%
>1600
3%
801-1600
0.4%
301-349 or <300
0.1%
1200B exactly
1%
1000B-1400B
0.1%
Yes
GPT-3 has a staggering 175 BILLION parameters
To put it into context
Hugging face's 176 Billion parameter model took 3.5 months on 384 top-of-the-line GPUs to train...
GPT-3 is also over 2 years old.
Nov 17, 1:56am: How many parameters with GPT-4 have? → How many parameters will GPT-4 have?
Get Ṁ200 play money
Related questions
Sort by:
@MayMeta GPT-4 recommends weighting by the reciprocal of the interval length.
i.e. The 800-wide entry should get a weight of 1/800 if the answer is in there. A 100-wide entry would get a weight of 1/100, so 8x more.
Related questions
What will be true about GPT-4.5?
How many parameters will GPT-4 have?
1.8T
Will GPT-4 be trained on more than 10T text tokens?
35% chance
Will GPT-4's parameter count be known by end of 2024?
42% chance
Is GPT-4 best? (Thru 2025)
52% chance
Will GPT-5 have fewer parameters than GPT-4? (1500M subsidy)
16% chance
Will GPT-5 have more than 10 trillion parameters?
29% chance
Will GPT-5 have over 100 trillion parameters?
22% chance
Will GPT-5 have over 1 trillion parameters?
87% chance
Will GPT-5 have over 10 trillion parameters?
74% chance