Open numeric market for the number of GPT-4 parameters. The market will resolve to 100B if the true count is below 100B, and to 100T if it is above 100T.
GPT-3 was 175 billion. There have been rumors that GPT-4 will be much bigger.
https://manifold.markets/MaxGhenis/will-gpt4-have-at-least-100-trillio (requires >100T, which is currently unlikely)
How does this market payout exactly?
@NoaNabeshima I don't know exactly how Manifold handles these, but if you buy and the prediction goes up (or resolves higher), you make M$, and vice versa.
So if you think the current level (410B) is high, then you predict lower. The more the market is mispriced, the more you'd win/lose.
@andrew I'm thinking there's a ~10% probability that GPT-4 falls within a reference class (MoE) with ~40T parameters in expectation, which makes GPT-4's expected number of parameters at least 4T under my probability distribution. But if Manifold just pays out based on whether the true amount is higher or lower than my estimate, then I shouldn't bet based on the expected number; I should bet based on my median estimate, which is kinda lame.
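A rough sketch of the expected-value arithmetic in the comment above. The 10% probability and ~40T MoE figure come from the comment; the 90% / 300B "non-MoE" branch is purely an illustrative assumption to complete the mixture:

```python
# Two-branch mixture over GPT-4's parameter count.
p_moe = 0.10          # probability GPT-4 is in the MoE reference class (from comment)
moe_params = 40e12    # ~40T expected parameters in that class (from comment)

p_other = 0.90        # remaining probability (assumed)
other_params = 300e9  # illustrative dense-model guess, ~300B (assumed)

# Expected value is dominated by the low-probability, high-magnitude MoE branch.
expected = p_moe * moe_params + p_other * other_params
print(f"Expected parameters: {expected:.3g}")  # ~4.3e12, i.e. >4T

# The median of this mixture sits in the non-MoE branch (~300B here),
# which is why betting on the median vs. the mean can differ so much.
```

This illustrates why the distinction matters: a 10% tail at 40T pulls the mean above 4T even though the median stays well under 1T.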
@NoaNabeshima Based on my P/L on this, it pays out based on how much you were right by (i.e., you can indeed trade based on expected value) — you get shares based on the current price, so the larger the mispricing, the more it'd pay out.
But perhaps someone from Manifold can clarify.
"GPT-4 won’t be the largest language model. Altman said it wouldn’t be much bigger than GPT-3. The model will be certainly big compared to previous generations of neural networks, but size won’t be its distinguishing feature. It’ll probably lie somewhere in between GPT-3 and Gopher (175B-280B)."