In 2020, Joe Carlsmith estimated that 10^15 FLOP/s is likely "enough" to match human task performance. The next post estimating this number will put it at >=10^15
32% chance
In OpenPhil's 2020 report, https://www.openphilanthropy.org/brain-computation-report , Joe Carlsmith writes:
> Overall, I think it more likely than not that 10^15 FLOP/s is enough to perform tasks as well as the human brain (given the right software, which may be very hard to create). And I think it unlikely (<10%) that more than 10^21 FLOP/s is required. But I’m not a neuroscientist, and there’s no consensus in neuroscience (or elsewhere).
This question will resolve when someone posts a new estimate of this quantity (>1.5k words, >10 citations): YES if that estimate is >=10^15 FLOP/s, otherwise NO. If no such estimate is posted by May 24th, 2024, I will resolve YES, since `>=10^15` will still be the standing estimate.
See also https://aiimpacts.org/brain-performance-in-flops
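For intuition on where a figure like 10^15 FLOP/s comes from, here is a minimal Python sketch of roughly the kind of "mechanistic method" arithmetic the report discusses (synapses × spike rate × FLOPs per spike through synapse). The parameter ranges below are illustrative assumptions, not values quoted from the report.

```python
# Back-of-envelope brain-compute estimate in the spirit of the report's
# mechanistic method: FLOP/s ~= (number of synapses)
#                              x (average spike rate through each synapse)
#                              x (FLOPs to model one spike through one synapse).
# All parameter values are illustrative assumptions.

def brain_flops(n_synapses: float, spike_rate_hz: float, flop_per_spike: float) -> float:
    """Estimate FLOP/s needed to replicate brain task-performance."""
    return n_synapses * spike_rate_hz * flop_per_spike

low  = brain_flops(n_synapses=1e14, spike_rate_hz=0.1, flop_per_spike=1)    # ~1e13 FLOP/s
mid  = brain_flops(n_synapses=1e15, spike_rate_hz=1.0, flop_per_spike=1)    # ~1e15 FLOP/s
high = brain_flops(n_synapses=1e15, spike_rate_hz=1.0, flop_per_spike=100)  # ~1e17 FLOP/s

print(f"low ~ {low:.0e}, central ~ {mid:.0e}, high ~ {high:.0e} FLOP/s")
```

Under these assumed ranges the central case lands near 10^15 FLOP/s, with the spread illustrating why the report treats estimates above 10^21 FLOP/s as unlikely but not ruled out.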
This question is managed and resolved by Manifold.
Related questions
At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs (81% chance)
When will a US government AI run overtake private AI compute by FLOP?
Assuming SB 1047 is passed, will the "compute threshold" of 10^26 flop be raised before 2030? (64% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2028? (44% chance)
Will any supercomputer reach 1 zettaFLOP before 2035? (82% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2026? (57% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2030? (66% chance)
At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs (86% chance)
At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs (89% chance)
Will a machine learning training run exceed 10^30 FLOP in China before 2035? (34% chance)