How many parameters will GPT-4 have?
(with play money!)

Open numeric answer for the number of GPT-4 parameters. The market will resolve to 100B if there are fewer than 100B parameters, and to 100T if there are more than 100T.

GPT-3 has 175 billion parameters. There have been rumors that GPT-4 will be much bigger.

Similar markets: (requires >100T, which is currently unlikely) (ranges)

NoaNabeshima

How does this market pay out, exactly?

Andrew Conner
is predicting HIGHER at 82%

@NoaNabeshima I don't know exactly how Manifold does these, but if you buy and the prediction goes up (or, resolves higher), you make M$. And vice versa.

So if you think the current level (410B) is high, then you predict lower. The more the market is mispriced, the more you'd win/lose.
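A stylized sketch of that payoff logic (this is a deliberate simplification for intuition, not Manifold's actual CPMM pricing; the function name and share formula are illustrative assumptions):

```python
def higher_payout(bet, prob_at_buy, prob_at_resolve):
    """Stylized numeric-market payoff: spending `bet` M$ on HIGHER at
    probability `prob_at_buy` buys roughly bet / prob_at_buy shares,
    and each share pays out the probability the market resolves to."""
    shares = bet / prob_at_buy
    return shares * prob_at_resolve

# Buying HIGHER at 41% in a market that resolves at 82% roughly
# doubles your stake under this simplification.
print(higher_payout(100, 0.41, 0.82))  # ≈ 200
```

The point being illustrated: the payout scales with how mispriced the market was when you bought, not just with which side of the resolution you were on.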

NoaNabeshima

@andrew I'm thinking there's a ~10% probability that GPT-4 falls within a reference class (MoE) with ~40T parameters in expectation, which makes GPT-4's expected number of parameters at least 4T according to my probability distribution. But if Manifold just pays out based on whether the amount is higher/lower than my estimate, I shouldn't bet based on the expected number; I should bet based on my median estimate, which is kinda lame.

NoaNabeshima

@NoaNabeshima er sorry, my MoE probability is >10%. But it isn't above 50%.
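The expected-value-versus-median gap in the comments above can be made concrete with a toy mixture (the ~10% MoE probability and ~40T expectation come from the thread; the dense-model range is an illustrative assumption, not anyone's stated distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# ~10% chance GPT-4 is a mixture-of-experts model with ~40T params
# in expectation; otherwise assume (illustratively) a dense model
# somewhere in the 100B-1T range.
is_moe = rng.random(n) < 0.10
dense = rng.uniform(100e9, 1e12, n)
moe = rng.exponential(40e12, n)
params = np.where(is_moe, moe, dense)

print(f"mean:   {params.mean():.3g}")      # ~4.5e12, dominated by the MoE tail
print(f"median: {np.median(params):.3g}")  # well under 1e12, the dense case
```

The mean lands in the trillions because the small MoE probability drags it up (0.10 × 40T alone contributes 4T), while the median stays in the dense regime. Which statistic you should bet toward depends entirely on the payout rule.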

Andrew Conner
is predicting HIGHER at 80%

@NoaNabeshima Based on my P/L on this, it pays out based on how much you were right by (i.e., you can indeed trade based on expected value). You get shares based on the current price, so the larger the mispricing, the more it'd pay out.

But perhaps someone from Manifold can clarify.

NoaNabeshima

@andrew If I bet M500 on Higher, my max payout is M587, whereas if I bet M500 on lower, the max payout is M1197. I wonder if the market is set so that you bet on e^Expected[ln(params)] or something like that?
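The payout asymmetry described above would follow from the market being priced on a log scale between its resolution bounds. Here is a hedged sketch of what such a mapping could look like; this is a guess at the mechanism, not Manifold's documented implementation:

```python
import math

LOW, HIGH = 100e9, 100e12  # resolution bounds from the market description

def value_to_prob(value):
    """Map a parameter count to a market probability on a log scale."""
    v = min(max(value, LOW), HIGH)
    return (math.log(v) - math.log(LOW)) / (math.log(HIGH) - math.log(LOW))

def prob_to_value(prob):
    """Inverse: the price implies exp of a weighted log-space estimate,
    i.e. betting on e^Expected[ln(params)] rather than Expected[params]."""
    return LOW * (HIGH / LOW) ** prob

print(value_to_prob(410e9))                 # ~0.20 on this log scale
print(prob_to_value(value_to_prob(410e9)))  # round-trips to ~410e9
```

Under this mapping, equal moves in probability correspond to equal multiplicative moves in parameter count, which would explain why LOWER and HIGHER bets of the same size have different maximum payouts.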

Nikita Brancatisano
bought Ṁ25 of LOWER

"GPT-4 won’t be the largest language model. Altman said it wouldn’t be much bigger than GPT-3. The model will be certainly big compared to previous generations of neural networks, but size won’t be its distinguishing feature. It’ll probably lie somewhere in between GPT-3 and Gopher (175B-280B)."
