
What will the inference cost of the best publicly available LM be in 2030?
Current answer probabilities:
7%
.01-.1 docs: 12%
.1-1 docs: 23%
1-10 docs: 26%
10-100 docs: 12%
100-1K docs: 13%
1K-10K docs: 2%
10K-100K docs: 0.2%
.001-.01 docs: 4%
Consider the best publicly available language model in 2030. For a single 2023 US dollar, for how many 2K-word documents can I generate a five-word completion each?
I will pick a somewhat conservative estimate of the best inference cost I can achieve after working for three weeks with whatever funds I have at the time and without using publicly inaccessible resources.
Multimodal models that can operate on text count as LMs for the purpose of this question.
I will only accept answers that each span one order of magnitude (OOM), like so:
.1-1 docs, 1-10 docs, 10-100 docs
I may trade in this market.
This question is managed and resolved by Manifold.
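
To make the resolution unit concrete, here is a minimal sketch of the arithmetic behind "documents per dollar". The per-million-token prices and the ~4/3 tokens-per-word ratio are illustrative assumptions, not figures from the market or any real model, and the helper functions are hypothetical.

import math

# Sketch: convert an assumed per-token price into this market's
# "documents per 2023 dollar" unit and map it to an OOM answer bucket.
# Prices and the tokens-per-word ratio are illustrative assumptions.

TOKENS_PER_WORD = 4 / 3  # rough average for English text (assumption)

def docs_per_dollar(input_price_per_mtok: float,
                    output_price_per_mtok: float,
                    prompt_words: int = 2_000,
                    completion_words: int = 5) -> float:
    """Number of (2K-word prompt, 5-word completion) calls one dollar buys."""
    prompt_tokens = prompt_words * TOKENS_PER_WORD
    completion_tokens = completion_words * TOKENS_PER_WORD
    cost_per_call = (prompt_tokens * input_price_per_mtok +
                     completion_tokens * output_price_per_mtok) / 1e6
    return 1.0 / cost_per_call

def oom_bucket(docs: float) -> str:
    """Label the one-order-of-magnitude bucket a docs-per-dollar value falls in."""
    lo = 10 ** math.floor(math.log10(docs))
    return f"{lo:g}-{lo * 10:g} docs"

# With made-up prices of $0.50 / $1.50 per million input/output tokens,
# each call costs about $0.0013, so one dollar buys roughly 740 documents:
docs = docs_per_dollar(0.50, 1.50)
print(f"{docs:,.0f} docs per dollar -> {oom_bucket(docs)}")  # ~744 -> 100-1000 docs

Under those assumed prices the answer would land in the "100-1K docs" bucket; the same helper maps any quoted price into one of the buckets above.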
Related questions
Will the best LLM in 2025 have <500 billion parameters? (24% chance)
Will OpenAI inference costs fall by 100x over the next 18 months? (32% chance)
Will LLM training costs fall 300x by 2028? (87% chance)
Will the best LLM in 2025 have <1 trillion parameters? (42% chance)
Will it cost $30 to train a GPT-3 level model in 2030? (19% chance)
Will the best LLM in 2027 have <1 trillion parameters? (26% chance)
Will we see a public GPU compute sharing pool for LLM model training or inference before 2026? (86% chance)
Will the best LLM in 2027 have <250 billion parameters? (12% chance)
[Carlini questions] Cost of a million output words in 2030 for an LLM that achieves at least current benchmark SOTA (0.2)
Will the best LLM in 2026 have <1 trillion parameters? (40% chance)