Will AI accelerators improve in FLOPs/watt by 100x over an NVIDIA H100 by 2033?
92% chance
Compared to an NVIDIA H100, will tensor TFLOPs/watt improve by 100x by 2033? AI accelerators in scope for this question must be deployed at significant scale, with at least 100k units or $100M (in 2024 dollars) in production, and must have published perf/watt numbers.

This market will count peak FLOPs/watt at k bits of precision, adjusted by a factor of 2^(1 - 32/k). That is, 16-bit precision counts 1/4 as much as 32-bit, which counts 1/4 as much as 64-bit precision.
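The adjustment can be sketched in Python by taking the stated factor 2^(1 - 32/k) at face value; the function names and the example accelerator figures below are illustrative, not part of the market's resolution criteria:

```python
def precision_factor(k_bits: float) -> float:
    """Weight applied to peak FLOPs/watt measured at k-bit precision,
    per the market's stated factor 2^(1 - 32/k)."""
    return 2 ** (1 - 32 / k_bits)

def adjusted_flops_per_watt(peak_flops_per_watt: float, k_bits: float) -> float:
    """Precision-adjusted perf/watt used for comparison against the H100 baseline."""
    return peak_flops_per_watt * precision_factor(k_bits)

# 32-bit is the reference precision (factor 1.0).
# Hypothetical accelerator: 2000 TFLOPs peak at 700 W, measured at FP16.
raw = 2000 / 700                       # ~2.86 TFLOPs/W raw
adj = adjusted_flops_per_watt(raw, 16) # adjusted by the FP16 factor
print(f"raw={raw:.2f} TFLOPs/W, adjusted={adj:.2f} TFLOPs/W")
```

Under this criterion, a 100x improvement over the H100 must hold after each chip's peak number is weighted by the precision at which it was measured, so low-precision peak figures alone cannot satisfy the threshold.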
Related questions
- When will a US government AI run overtake private AI compute by FLOP?
- Will software-side AI scaling appear to be suddenly discontinuous before 2025? (22% chance)
- What will be the maximum achievable flop utilization on the next generation of Nvidia server chips?
- Will AGI be powered by Nvidia GPUs? (57% chance)
- Will 2024 be the year when AI capabilities progress from AI hardware scaling hits a wall? (28% chance)
- At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs (77% chance)
- Will a machine learning training run exceed 10^25 FLOP in China before 2025? (63% chance)
- Will AMD release a product that is competitive with NVIDIA in the AI hardware accelerator space before 2028? (76% chance)
- Will a machine learning training run exceed 10^26 FLOP in China before 2027? (52% chance)
- Will a machine learning training run exceed 10^27 FLOP in China before 2028? (44% chance)