While researching GPT-4 speculation, I found estimated parameter counts ranging anywhere from 175 billion (the same as GPT-3) to over 100 trillion.
This market will resolve YES if GPT-4 has over 1 trillion parameters, otherwise it will resolve NO. The market close date will be indefinitely extended until an official figure for the total number of trainable parameters of the largest released model at initial release has been announced.
What if we simply made the common-sense decision and resolved this YES?
Creator has been AFK for a year now.
Jensen Huang clearly says in his keynote that GPT-4 has roughly 1.8 trillion parameters.
> The latest, the state-of-the-art OpenAI model, is approximately 1.8 trillion parameters.
The name GPT-MoE-1.8T recurs multiple times throughout the presentation. Jensen knows what he's talking about; NVIDIA is the one providing the hardware to OpenAI.
Source
I think the only uncertainty here is exactly which model he is talking about, but I'd say it's pretty safe to resolve this.
@Shump How do you know he's not just citing whatever the best-known public number is, rather than revealing confidential information in a keynote? If you asked him for a source, he might point you to the Semianalysis article or the one from The Information.
He actually seems like a worse source than what we already have. I don't understand you guys' sudden enthusiasm.
@Mira He supplies the hardware to OpenAI. In the presentation he is describing the actual system that OpenAI uses (not in a lot of detail, but it's not just a passing mention). He definitely knows the parameter count. He's not saying anything like "according to external sources" or whatever. He's flat out stating that this is the approximate parameter count.
@Shump It seems like he's referring to the base GPT-4 (not Turbo). The presentation mentions a context length of 32k, so it's definitely not GPT-5, which will almost certainly have a context length at least as long as Turbo's.
Timestamp: 54:31
IMO that concludes it. If it's definitely GPT-4 this can resolve.
Another interesting tidbit from the video: GPT-4 used more than 10^10 PFLOPs of compute in training.
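As a rough sanity check (my own back-of-envelope, not something stated in the video): plugging the commonly leaked figures of ~280B active parameters per token for the MoE and ~13T training tokens into the standard C ≈ 6·N·D training-compute estimate lands just above that 10^10 PFLOPs figure. Both inputs are rumored numbers, not confirmed by NVIDIA or OpenAI:

```python
# Back-of-envelope training-compute check (assumed, unofficial figures).
# Uses the standard approximation C ≈ 6 * N * D, where N is the number of
# parameters active per token and D is the number of training tokens.

active_params = 280e9    # assumed active params per token for the MoE (leaked figure)
training_tokens = 13e12  # assumed training-token count (leaked figure)

total_flops = 6 * active_params * training_tokens  # ~2.2e25 FLOPs
pflops = total_flops / 1e15                         # convert FLOPs to PFLOPs

print(f"~{total_flops:.1e} FLOPs = ~{pflops:.1e} PFLOPs")  # ~2.2e25 FLOPs = ~2.2e10 PFLOPs
```

So the compute figure quoted in the keynote is at least consistent with the 1.8T-total, MoE picture of GPT-4.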
@Shump I find it odd, though, that he is the one releasing what are essentially OpenAI trade secrets. That leaves me with doubt. Does no one else have doubts arising from the fact that there is a financial incentive to make such a claim?
@parhizj I highly doubt he would reveal this information if OpenAI hadn't agreed to it in the first place.
@Shump I mean, it's not really in doubt. But why do you trust Jensen Huang over OpenAI insiders who worked on GPT-4 and said "it has over 1 trillion parameters", or the person leading PyTorch, or [REDACTED]? Your incremental enthusiasm just seems wrong.
This would not have convinced me if I wasn't already convinced.
@Mira The CEO of NVIDIA is not just an employee, and information revealed in a keynote is not a leak. You can be pretty sure it's intentional.
I'm not sure what was revealed before, but this seems pretty authoritative to me.
I'm going to plunge into conspiracy land and say I assign more than 0% chance this is still disinformation (financial motives). The question asked for an official number, which I took to mean one released by OpenAI, but I suppose everyone thinks Huang is definitively alluding to OpenAI and not someone else, and also counts as an "official" representative for OpenAI?
https://www.eetimes.com/sambanova-trains-trillion-parameter-model-to-take-on-gpt-4/
https://manifold.markets/EA42/will-gpt4-have-over-1-trillion-para#v7PaYMGhgBKV6Rp2XB54
Screenshot from the announcement: https://manifold.markets/andrew/how-many-parameters-will-gpt4-have#3h9n4rzhk6d (they just called it GPT-MoE-1.8T, which doesn't confirm that it's GPT-4)
Market description says to extend the close date until official number is announced. @MarcusAbramovitch any objections?
@Joshua There's an argument to resolve based on 1.76 trillion, but I agree. We should extend.