Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
92% chance
Since the title has char limitations, here is the full question:
Will a model comparable to, or better than, GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
Many models will be kept secret, and their training details will be hard to estimate. We will try our best to produce an estimate. If it's roughly within one OOM of the required threshold, it'll count.
The question resolves in the spirit of whether low-energy, high-efficiency models will be trained, rather than on whether it was 1/10th or 1/9th the energy.
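The "within one OOM of the required threshold" criterion can be sketched numerically. The function and the energy figures below are hypothetical (GPT-4's actual training energy is not public); the point is only to show how the one-order-of-magnitude tolerance around the 1/10th threshold would be checked.

```python
import math

def qualifies(candidate_energy: float, gpt4_energy: float) -> bool:
    """Hedged sketch of the resolution criterion: the candidate model's
    estimated training energy should be roughly 1/10th of GPT-4's, and
    counts if it falls within one order of magnitude of that threshold."""
    threshold = gpt4_energy / 10
    # Within one OOM of the threshold: ratio between 0.1x and 10x.
    return abs(math.log10(candidate_energy / threshold)) <= 1.0

# Illustrative numbers in MWh (purely hypothetical):
print(qualifies(5_000, 50_000))   # exactly 1/10th -> True
print(qualifies(60_000, 50_000))  # more energy than GPT-4 itself -> False
```

Note that a literal one-OOM tolerance is wide: anything from 1/100th up to the full GPT-4 energy budget would pass, which is why the description emphasizes resolving in spirit rather than on exact ratios.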
This question is managed and resolved by Manifold.
Related questions
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
83% chance
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026?
84% chance
Will we have an open-source model that is equivalent GPT-4 by end of 2025?
82% chance
Before 2028, will anyone train a GPT-4-level model in a minute?
14% chance
Will a GPT-4 quality model be trained for under $10,000 by 2030?
78% chance
Will a language model that runs locally on a consumer cellphone beat GPT4 by EOY 2026?
84% chance
By January 2026, will we have a language model with similar performance to GPT-3.5 (i.e. ChatGPT as of Feb-23) that is small enough to run locally on the highest end iPhone available at the time?
93% chance
Will it be possible to disentangle most of the features learned by a model comparable to GPT-4 this decade?
37% chance
By January 2026, will a language model with similar performance to GPT-4 be able to run locally on the latest iPhone?
81% chance
Will a model as great as GPT-5 be available to the public in 2025?
84% chance