In 2020, Joe Carlsmith estimated that 10^15 FLOP/s is "enough" to perform tasks as well as the human brain. The next published estimate of this number will also be >=10^15.
32% chance
In OpenPhil's 2020 report (https://www.openphilanthropy.org/brain-computation-report), Joe Carlsmith writes:
> Overall, I think it more likely than not that 10^15 FLOP/s is enough to perform tasks as well as the human brain (given the right software, which may be very hard to create). And I think it unlikely (<10%) that more than 10^21 FLOP/s is required. But I’m not a neuroscientist, and there’s no consensus in neuroscience (or elsewhere).
This question will resolve when someone posts a new estimate of this number (>1.5k words, with >10 citations): YES if that estimate is `>=10^15` FLOP/s, NO otherwise. If no such estimate is posted by May 24th, 2024, I will resolve YES, since `>=10^15` will still be the standing estimate.
See also https://aiimpacts.org/brain-performance-in-flops
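For intuition on where numbers like 10^15 come from: a common mechanistic back-of-the-envelope, in the spirit of the estimates surveyed at the AI Impacts link above, multiplies neuron count by synapses per neuron and average firing rate, counting roughly one FLOP per synaptic event. A minimal sketch in Python (the specific parameter values are illustrative assumptions, not Carlsmith's figures):

```python
# Back-of-the-envelope mechanistic estimate of brain compute.
# All parameter values below are illustrative assumptions.

neurons = 1e11  # ~10^11 neurons in a human brain

# Low-end and high-end assumptions for the per-neuron factors:
low = {"synapses": 1e3, "hz": 0.1}     # sparse connectivity, low average firing rate
high = {"synapses": 1e4, "hz": 100.0}  # dense connectivity, peak firing rate

for name, p in [("low", low), ("high", high)]:
    # ~one FLOP per synaptic event (a large simplification)
    flops = neurons * p["synapses"] * p["hz"]
    print(f"{name}: ~{flops:.0e} FLOP/s")

# low:  ~1e+13 FLOP/s
# high: ~1e+17 FLOP/s
```

Carlsmith's best guess of 10^15 FLOP/s sits inside this range; his <10% tail above 10^21 FLOP/s corresponds to models where much more computation happens per synapse or within neurons than this simple spike-counting picture assumes.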
This question is managed and resolved by Manifold.
Related questions
- At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs (83% chance)
- At least one of the most powerful neural nets at end of 2030 will be trained using 10^27 FLOPs (90% chance)
- At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs (89% chance)
- Will there be an announcement of a model with a training compute of over 1e30 FLOPs by the end of 2025? (5% chance)
- At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs (86% chance)
- Will a machine learning training run exceed 10^26 FLOP in China before 2026? (57% chance)
- Will any supercomputer reach 1 zettaFLOP before 2035? (82% chance)
- Will a machine learning training run exceed 10^25 FLOP in China before 2025? (77% chance)
- Will a machine learning training run exceed 10^26 FLOP in China before 2029? (82% chance)
- Will a machine learning training run exceed 10^27 FLOP in China before 2030? (66% chance)