Is the brain's learning algorithm superior to backpropagation?
80% chance

If we fail to decode the brain's learning algorithm, this question will resolve ambiguously.


This question will resolve positively if, before 2033, a system trained using the brain's learning algorithm either surpasses the SOTA achieved with backpropagation on a major benchmark, or matches that SOTA while using fewer than half the FLOPs. If no such system does so, this question resolves negatively.

Examples of major benchmarks are:
1) Playing Go
2) ImageNet
3) BIG-bench
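
For concreteness, here is a hypothetical Python sketch of the resolution logic; the function and its arguments are placeholders, not part of the question text:

```python
def resolves_positively(brain_score: float, sota_score: float,
                        brain_flops: float, sota_flops: float) -> bool:
    """Hypothetical encoding of the resolution criteria above.

    'Matches the SOTA' is treated here as scoring at least as well;
    the question itself does not pin down an exact tolerance.
    """
    surpasses_sota = brain_score > sota_score
    matches_on_half_flops = (brain_score >= sota_score
                             and brain_flops < 0.5 * sota_flops)
    return surpasses_sota or matches_on_half_flops
```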


Backpropagation is not an optimization algorithm; it's an algorithm for computing gradients. It's used as a component of many optimization algorithms, but is not itself one. Are you asking if any algorithm that includes backprop will beat the human brain?
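
For concreteness, a minimal PyTorch sketch of that distinction: `backward()` runs backpropagation to compute gradients, and a separate optimizer (here plain SGD) uses those gradients to update the parameters.

```python
import torch

model = torch.nn.Linear(10, 1)                    # a tiny model
x, y = torch.randn(32, 10), torch.randn(32, 1)    # dummy batch

loss = torch.nn.functional.mse_loss(model(x), y)

# Backpropagation: computes d(loss)/d(parameter) for every parameter.
loss.backward()

# Optimization: a separate algorithm that consumes those gradients.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
optimizer.step()        # update parameters using the stored gradients
optimizer.zero_grad()   # clear gradients before the next iteration
```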

predicts YES

@vluzko That's a good point. I was brushing everything under "backprop", which is incorrect. I want to have a market on Hinton's conjecture that current NNs have better learning algorithms than the brain. I'm not sure how to cash this out as a question. I might change the question to "if you train an n-parameter neural net vs. an n-parameter simulation isomorphic to the brain, which one does better on standard ML benchmarks?" But maybe the difference only shows up in how the two systems scale with parameter count.
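
A rough sketch of what that parameter-matched comparison could look like; both evaluation functions are hypothetical placeholders (a brain-isomorphic learner cannot be implemented today, which is part of the question's premise):

```python
def evaluate_backprop_net(n_params: int) -> float:
    """Train an n-parameter network with backprop on a benchmark; return its score."""
    raise NotImplementedError("stand-in for a standard ML training run")

def evaluate_brain_isomorphic(n_params: int) -> float:
    """Train an n-parameter simulation using the (decoded) brain learning rule."""
    raise NotImplementedError("stand-in for a brain-isomorphic learner")

def scaling_comparison(param_counts):
    """Score both learners at matched parameter counts to compare scaling curves."""
    return [
        {
            "params": n,
            "backprop": evaluate_backprop_net(n),
            "brain_like": evaluate_brain_isomorphic(n),
        }
        for n in param_counts
    ]
```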

I guess another operationalization is: if we decode the brain and build a new learning system that learns the same way (part of) the brain does, will that system outperform e.g. a SOTA transformer model, or some other ANN? That's still too vague, though.


Any recommendations would be appreciated.

bought Ṁ100 of YES

I'm pumping this up.