Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
2026 · 84% chance

The total energy consumed to train GPT-4 can be estimated at roughly 50-60 million kWh.

1/10th of this energy = 5-6 million kWh

1/100th of this energy = 0.5-0.6 million kWh

See calculations below:
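
A minimal sketch of the arithmetic in Python (the GPU count, run length, and PUE are widely reported rumors, not confirmed figures):

```python
# Rough reconstruction of the 50-60 million kWh estimate and the
# resolution thresholds. GPU count, run length, and PUE are widely
# reported rumors, not confirmed numbers.

GPUS = 25_000         # rumored A100 count for the GPT-4 run (assumption)
KW_PER_GPU = 6.5 / 8  # ~6.5 kW per 8-GPU DGX A100 node, ~0.81 kW per GPU
DAYS = 95             # rumored run length (assumption)
PUE = 1.2             # datacenter overhead (assumption)

train_kwh = GPUS * KW_PER_GPU * DAYS * 24 * PUE
print(f"estimated GPT-4 training energy: {train_kwh / 1e6:.1f} million kWh")
print(f"1/10th threshold:  {train_kwh / 10 / 1e6:.2f} million kWh")
print(f"1/100th threshold: {train_kwh / 100 / 1e6:.2f} million kWh")
```

This lands at about 55.6 million kWh, consistent with the 50-60 million kWh estimate and the 5-6 million kWh resolution bar above.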

bought Ṁ500 YES

Jensen Huang (CEO of NVIDIA) said that with Blackwell GPUs, you could train GPT-4 with only about 4 MW of power consumed. Looks like even without algorithmic improvements we can get there.
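
Quick sanity check: 4 MW is a power figure, so a run length is needed to turn it into energy. The ~90-day window below comes from the same GTC 2024 keynote claim (about 2,000 Blackwell GPUs) and is taken here as an assumption:

```python
# Turn the 4 MW power figure into energy for an assumed ~90-day run
# (run length from the same keynote claim, not from this comment).

POWER_MW = 4.0
DAYS = 90

energy_kwh = POWER_MW * 1_000 * DAYS * 24  # MW -> kW, days -> hours
print(f"Blackwell-run energy: {energy_kwh / 1e6:.2f} million kWh")
# ~8.64 million kWh: within about 1.5x of the 5-6 million kWh
# (1/10th) bar, so modest additional gains would clear it.
```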

bought Ṁ200 of YES

Approximately a 4x efficiency improvement from silicon alone, based on the latest GPUs being announced now (specifically the MI300X vs. the A100).
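
Applying that 4x factor to the estimate in the description (a rough sketch that assumes perf/W translates one-to-one into training energy):

```python
# Scale the baseline estimate by a ~4x silicon perf/W gain, assuming
# it translates one-to-one into training energy (a simplification that
# ignores interconnect, memory, and utilization differences).

BASELINE_KWH = (50e6, 60e6)  # estimated GPT-4 training energy range
SILICON_GAIN = 4.0           # MI300X vs. A100, per the comment above

for kwh in BASELINE_KWH:
    print(f"{kwh / 1e6:.0f}M kWh -> {kwh / SILICON_GAIN / 1e6:.1f}M kWh")
# 12.5-15M kWh: still ~2-2.5x above the 1/10th bar, so the remainder
# would have to come from algorithmic/software efficiency.
```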
