Will GPT-4 have over 1 trillion parameters?
Resolved YES (Mar 20)

Speculation about GPT-4's parameter count ranged anywhere from 175 billion (the same as GPT-3) to over 100 trillion.

This market will resolve YES if GPT-4 has over 1 trillion parameters; otherwise it will resolve NO. The market close date will be extended indefinitely until an official figure has been announced for the total number of trainable parameters of the largest model available at initial release.


🏅 Top traders

| # | Total profit |
|---|--------------|
| 1 | Ṁ96,417 |
| 2 | Ṁ28,283 |
| 3 | Ṁ6,123 |
| 4 | Ṁ5,120 |
| 5 | Ṁ3,799 |

Summing MoE expert parameters still rubs me the wrong way.

I may be biased by my great loss of Ṁ1, but I still think this market should have been N/A'ed.
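For readers puzzled by the summing debate, here is a minimal sketch of MoE parameter accounting. The configuration below follows the widely rumored, unconfirmed GPT-4 figures (e.g., from the Semianalysis article mentioned later in this thread); every number is an illustrative assumption, not an official OpenAI figure.

```python
# Hypothetical MoE parameter accounting. All figures are rumored /
# illustrative assumptions about GPT-4, not confirmed by OpenAI.
n_experts = 16            # assumed number of experts
expert_params = 111e9     # assumed parameters per expert (MLP weights)
shared_params = 55e9      # assumed shared parameters (attention etc.)
experts_per_token = 2     # assumed experts routed per token

# "Summed" count: every expert's weights. This is the accounting
# behind the GPT-MoE-1.8T name.
total_params = n_experts * expert_params + shared_params

# "Active" count: parameters actually used in one forward pass.
active_params = experts_per_token * expert_params + shared_params

print(f"total:  {total_params / 1e12:.2f}T parameters")   # ~1.83T
print(f"active: {active_params / 1e12:.2f}T parameters")  # ~0.28T
```

Under these assumptions the summed count clears 1 trillion comfortably while the active-per-token count does not, which is exactly why the resolution hinges on which figure you count.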

sold Ṁ4,311 YES

What if we simply made the common-sense decision and resolved this YES?

Creator has been AFK for a year now.

Ah, I hadn't even seen that Shump had posted this in Mod Help when I posted this comment. I believe we are indeed going to resolve this YES, as three mods agreed it should resolve.

@Joshua I posted a different market, but the source is the same.

Jensen Huang clearly says in his keynote that it has roughly 1.8 trillion parameters.
> The latest, the state-of-the-art OpenAI model, is approximately 1.8 trillion parameters.

The name GPT-MoE-1.8T recurs multiple times throughout the presentation. Jensen knows what he's talking about; NVIDIA is the one providing the hardware for OpenAI.
Source

I think the only uncertainty here is exactly which model he is talking about, but I'd say it's pretty safe to resolve this.

@Shump How do you know he's not just citing whatever the best-known public number is, instead of revealing confidential information in a keynote? If you asked him for a source, he might point you to the Semianalysis article or the one from The Information.

He actually seems like a worse source than what we already have. I don't understand you guys' sudden enthusiasm.

@Mira He supplies the hardware to OpenAI. He is describing the actual system that OpenAI uses in the presentation (not with a lot of details, but it's not just a passing mention). He definitely knows the parameter count. He's not saying anything like "according to external sources" or whatever. He's just flat out saying that this is the approximate parameter count.

@Shump It seems like he's referring to the base GPT-4 (not Turbo). The presentation mentions a context length of 32k. Definitely not GPT-5 then, which will almost certainly have a context length at least as long as Turbo's.

Timestamp: 54:31

IMO that concludes it. If it's definitely GPT-4 this can resolve.

Another interesting tidbit from the video: GPT-4 used more than 10^10 PFLOPs (i.e., over 10^25 FLOPs) of compute in training.
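As a rough sanity check on that compute figure: the standard C ≈ 6 · N · D approximation (N = active parameters per token, D = training tokens) lands in the same ballpark when fed the rumored numbers. Both inputs below are illustrative assumptions, not official figures.

```python
# Rough training-compute sanity check using the common C ≈ 6 * N * D
# approximation. N and D are rumored / illustrative, not official.
active_params = 280e9   # assumed active parameters per token (MoE)
train_tokens = 13e12    # rumored training set size in tokens

train_flops = 6 * active_params * train_tokens
print(f"estimated training compute: {train_flops:.2e} FLOPs")  # ~2.18e25
print("claimed lower bound:        1e25 FLOPs (10^10 PFLOPs)")
```

Both land above 10^25 FLOPs, so the keynote claim is at least internally consistent with a trillion-scale MoE.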

@Shump I find it odd, though, that he is the one releasing what are essentially OpenAI's trade secrets. That leaves me with some doubt. Does no one else have doubts stemming from the financial incentive to make such a claim?

@parhizj I highly doubt he would reveal this information if OpenAI hasn't agreed to it in the first place.

@Shump I mean, it's not really in doubt. But why do you trust Jensen Huang over OpenAI insiders who worked on GPT-4 and said "it has over 1 trillion parameters", or the person leading PyTorch, or [REDACTED]? Your incremental enthusiasm just seems wrong.

This would not have convinced me if I weren't already convinced.

@Mira The CEO of NVIDIA is not just an employee, and information revealed in a keynote is not a leak. You can be pretty sure it's intentional.

I'm not sure what was revealed before, but this seems pretty authoritative to me.

I'm going to plunge into conspiracy land and say I assign more than 0% chance this is still disinformation (financial motives). The question asked for an official number, which I took to mean one released by OpenAI, but I suppose everyone thinks Huang is definitively alluding to OpenAI and not someone else, and also counts as an "official" representative for OpenAI?

https://www.eetimes.com/sambanova-trains-trillion-parameter-model-to-take-on-gpt-4/

https://manifold.markets/EA42/will-gpt4-have-over-1-trillion-para#v7PaYMGhgBKV6Rp2XB54

@Bill this is like a year old, right?

Nvidia has said it's 1.8 trillion.

link?

@jacksonpolack Don’t have a link but it was in their presentation.

bought Ṁ100,000 YES

@M3465 You sound trustworthy. Okay, let's buy.

screenshot from announcement: https://manifold.markets/andrew/how-many-parameters-will-gpt4-have#3h9n4rzhk6d (they just called it GPT-MoE-1.8T, which doesn't confirm that it's GPT-4)

predicted YES

Market description says to extend the close date until an official number is announced. @MarcusAbramovitch any objections?

predicted YES

@Joshua there's an argument to resolve based on 1.76 trillion, but I agree. We should extend.

bought Ṁ1,000 NO

@IsaacKing is on a @Mira-type mission

bought Ṁ0 of NO

I have received private info that this will be resolving YES in short order. I'm interested in loans to push this market up, or I'd be willing to sell my stake above 94% due to my liquidity needs.

🦜
