At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs
Resolved YES (Apr 15)

Resolves YES if at least one of the most powerful neural nets publicly known to exist by the end of 2026 was trained using at least 10^26 FLOPs. This is ~3 exaFLOP/s-years. It does not matter whether the compute was distributed, as long as one of the largest models used it. A neural net that used 10^26 FLOPs but is inferior to other models does not count. Low-precision floating-point formats such as fp32, fp16, or fp8 are permitted.
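
As a sanity check on the conversion above, here is a small Python snippet (pure arithmetic, no assumptions beyond the stated threshold) converting 10^26 total FLOPs into a sustained rate over one year:

```python
# Convert 1e26 total FLOPs into a sustained exaFLOP/s rate over one year.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 seconds

total_flops = 1e26
sustained_exaflops = total_flops / SECONDS_PER_YEAR / 1e18
print(f"{sustained_exaflops:.2f} exaFLOP/s sustained for one year")
# -> ~3.17, i.e. roughly 3 exaFLOP/s-years, matching the figure above
```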

Resolves NO if no such model exists by end of 2026.

If we have no good estimates of training compute usage of top models, resolves N/A.
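
For context on how such estimates are typically produced (this heuristic is not part of the market's criteria): training compute for dense transformers is commonly approximated as 6 × parameters × training tokens. The sketch below uses hypothetical model and dataset sizes purely for illustration.

```python
# A minimal sketch of the common "6 * N * D" heuristic for estimating
# dense-transformer training compute: FLOPs ~ 6 x parameters x tokens.
def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer."""
    return 6 * n_params * n_tokens

# Hypothetical example: a 2-trillion-parameter model trained on 8 trillion tokens.
est = training_flops(2e12, 8e12)
print(f"~{est:.1e} FLOPs")  # ~9.6e+25, just under the 1e26 threshold
```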

