Will a language model comparable to GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?
92% chance · closes 2027

Since the title has character limitations, here is the full question:

Will a model comparable to, or better than, GPT-4 be trained, with ~1/10th the amount of energy it took to train GPT-4, by 2028?

Many models will be kept secret, and their training details will prove hard to estimate. We will try our best to get an estimate. If it's roughly within one order of magnitude (OOM) of the required threshold, it'll count.
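The "within one OOM" test above can be sketched as a quick check on the log-ratio of the estimated energy to the threshold. All energy figures below are hypothetical placeholders for illustration, not actual GPT-4 training numbers:

```python
import math

def within_one_oom(estimated_energy_mwh, threshold_energy_mwh):
    """True if the estimate is within one order of magnitude of the
    threshold, i.e. |log10(estimate / threshold)| <= 1."""
    return abs(math.log10(estimated_energy_mwh / threshold_energy_mwh)) <= 1.0

# Hypothetical numbers only: suppose GPT-4's training took 50,000 MWh,
# making the ~1/10th threshold 5,000 MWh.
gpt4_energy_mwh = 50_000
threshold_mwh = gpt4_energy_mwh / 10

print(within_one_oom(2_000, threshold_mwh))   # True: within one OOM of 5,000
print(within_one_oom(60_000, threshold_mwh))  # False: more than one OOM above
```

Under this reading, an estimate anywhere between 500 MWh and 50,000 MWh of the hypothetical threshold would count, which matches the question's stated tolerance for rough estimates.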

The question resolves in the spirit of whether low-energy, high-efficiency models will be trained, rather than on whether it's exactly 1/10th or 1/9th the energy.
