How many parameters does the new possibly-SOTA large language model, Claude 3 Opus, have?
Less than 250 billion: 1.2%
Between 250 and 500 billion: 1.3%
Between 500 billion and 1 trillion: 13%
Between 1 trillion and 1.5 trillion: 49%
Between 1.5 trillion and 2 trillion: 26%
More than 2 trillion: 10%

Anthropic released their new Claude 3 model today, which apparently outperforms GPT-4, the previous state-of-the-art LLM, on various benchmarks: https://www.anthropic.com/news/claude-3-family

GPT-4 is rumored to have 1.8 trillion parameters. How many does Claude 3 have?
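For a rough sense of the scale these answer buckets imply, here is a back-of-the-envelope sketch (pure arithmetic, not anything confirmed about Claude 3 or GPT-4) of how much raw weight storage a model of the rumored GPT-4 size would need at common numeric precisions:

```python
# Back-of-the-envelope memory footprint for storing dense model weights,
# assuming one value per parameter at a given numeric precision.
BYTES_PER_PARAM = {"fp32": 4, "fp16/bf16": 2, "int8": 1}

def weight_memory_tb(num_params: float, bytes_per_param: int) -> float:
    """Return raw weight storage in terabytes (1 TB = 1e12 bytes)."""
    return num_params * bytes_per_param / 1e12

rumored_gpt4_params = 1.8e12  # rumored figure cited above, not confirmed
for precision, nbytes in BYTES_PER_PARAM.items():
    print(f"{precision}: {weight_memory_tb(rumored_gpt4_params, nbytes):.1f} TB")
# fp32: 7.2 TB, fp16/bf16: 3.6 TB, int8: 1.8 TB
```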

Resolves N/A if there is no information on this by 2026. There does not need to be official confirmation of the parameter count; media reports that don't contradict each other will suffice.


https://x.com/aidan_mclau/status/1849607168607002674

This random Twitter user claims to have heard that Claude Opus is smaller than GPT-4. Not sure how credible they are.

