Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
53% chance
The total energy consumed to train GPT-4 can be estimated at roughly 50-60 million kWh.
1/10th of this energy ≈ 5-6 million kWh
1/100th of this energy ≈ 0.5-0.6 million kWh
See calculations below:
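The threshold arithmetic above can be sketched in a few lines of Python. This is a back-of-envelope helper, not part of the market's resolution criteria; the 50-60 million kWh range is the description's own estimate, and the function name `energy_budget` is an assumption introduced here for illustration.

```python
# Assumed baseline from the market description: GPT-4 training consumed
# roughly 50-60 million kWh (low and high estimates, in kWh).
GPT4_TRAIN_KWH = (50e6, 60e6)

def energy_budget(divisor):
    """Return the (low, high) energy range, in kWh, for a model trained
    with 1/divisor of GPT-4's estimated training energy."""
    lo, hi = GPT4_TRAIN_KWH
    return (lo / divisor, hi / divisor)

print(energy_budget(10))   # 1/10th  -> (5000000.0, 6000000.0) kWh
print(energy_budget(100))  # 1/100th -> (500000.0, 600000.0) kWh
```

These are the same 5-6 million kWh and 0.5-0.6 million kWh figures quoted in the description.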
If such a model is trained on synthetic data generated with a precursor model, does this take into account the energy used to train the precursor + run inference on it to produce the synthetic data?
Related questions
Will a 15 billion parameter LLM match or outperform GPT4 in 2024?
24% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
85% chance
Will an open-source LLM beat or match GPT-4 by the end of 2024?
81% chance
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
35% chance
Will there be an open source LLM as good as GPT4 by June 2024?
17% chance
Which next-gen frontier LLMs will be released before GPT-5? (2025)
Will xAI develop a more capable LLM than GPT-5 by 2026?
54% chance
Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
83% chance
An LLM as capable as GPT-4 runs on one 4090 by March 2025
31% chance
An LLM as capable as GPT-4 runs on one 3090 by March 2025
30% chance