How many parameters does the new possibly-SOTA large language model, Claude 3 Opus, have?
Less than 250 billion: 1.2%
Between 250 and 500 billion: 5%
Between 500 billion and 1 trillion: 21%
Between 1 trillion and 1.5 trillion: 44%
Between 1.5 trillion and 2 trillion: 21%
More than 2 trillion: 8%
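Out of curiosity, the probabilities above imply a rough market expectation for the parameter count. This is a sketch only: the bucket midpoints are assumptions, and the two open-ended buckets ("less than 250 billion" and "more than 2 trillion") need arbitrary representative values, which I've set to 125B and 2.5T.

```python
# Market-implied expected parameter count (in billions), using assumed
# midpoints for each bucket. The first and last representative values
# are arbitrary choices for the open-ended buckets.
buckets = [
    (125, 0.012),   # Less than 250 billion (assumed representative value)
    (375, 0.05),    # Between 250 and 500 billion
    (750, 0.21),    # Between 500 billion and 1 trillion
    (1250, 0.44),   # Between 1 trillion and 1.5 trillion
    (1750, 0.21),   # Between 1.5 trillion and 2 trillion
    (2500, 0.08),   # More than 2 trillion (assumed representative value)
]

# Probabilities sum to slightly over 1 due to display rounding; normalize.
total_prob = sum(p for _, p in buckets)
expected = sum(mid * p for mid, p in buckets) / total_prob
print(f"Implied expected parameter count: ~{expected:.0f} billion")
```

Under these assumptions the market's center of mass lands a bit under 1.3 trillion parameters, which is consistent with the 1–1.5 trillion bucket carrying the most probability.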
Anthropic released their new Claude 3 model today, which apparently outperforms GPT-4, the previous state-of-the-art LLM, on various benchmarks: https://www.anthropic.com/news/claude-3-family
GPT-4 is rumored to have 1.8 trillion parameters. How many does Claude 3 have?
Resolves N/A if there is no information on this by 2026. There does not need to be official confirmation of the parameter count, just media reports that don't contradict each other.
This question is managed and resolved by Manifold.
https://x.com/aidan_mclau/status/1849607168607002674
This random Twitter user claims to have heard that Claude Opus is smaller than GPT-4. Not sure how credible they are.