Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
84% chance
Training GPT-4 is estimated to have consumed roughly 50-60 million kWh of electricity in total.
1/10th of this energy = 5-6 million kWh
1/100th of this energy = 0.5-0.6 million kWh
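The thresholds above are simple fractions of the estimated range. A minimal sketch of the arithmetic, assuming the market's own 50-60 million kWh estimate for GPT-4 (not an official figure):

```python
# Rough arithmetic behind the market's resolution thresholds.
# The 50-60 million kWh range for training GPT-4 is the description's
# own estimate, not a confirmed number.
GPT4_ENERGY_KWH = (50e6, 60e6)  # (low, high) estimated kWh to train GPT-4

def threshold(fraction):
    """Return the (low, high) kWh range for a given fraction of GPT-4's energy."""
    low, high = GPT4_ENERGY_KWH
    return low * fraction, high * fraction

tenth = threshold(1 / 10)      # about 5-6 million kWh (this market)
hundredth = threshold(1 / 100) # about 0.5-0.6 million kWh (the 1/100th market)
```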
Related questions
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026? (48% chance)
Will there be an open source LLM as good as GPT4 by June 2024? (27% chance)
When will an open-source LLM be released with a better performance than GPT-4?
There will be an open source LLM approximately as good or better than GPT4 before 2025 (74% chance)
Will there be an OpenAI LLM known as GPT-4.5 by 2033? (25% chance)
Will there be an open source LLM as good as GPT4 by the end of 2024? (69% chance)
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026? (71% chance)
Will an open-source LLM beat or match GPT-4 by the end of 2024? (66% chance)
An LLM as capable as GPT-4 runs on one 3090 by March 2025 (29% chance)
China will make an LLM approximately as good or better than GPT4 before 2025 (80% chance)