Will software-side AI scaling appear to be suddenly discontinuous before 2025?
Resolved YES (Apr 12)

This market is taken from a comment by @L. I assume by "software-side scaling" they mean something like "discontinuous progress in algorithmic efficiency", and that is what I will use to resolve this market.

See this paper for an explanation of algorithmic efficiency: https://arxiv.org/abs/2005.04305. tldr: the efficiency of an algorithm is the number of FLOPs using that algorithm required to achieve a given target.

Current estimates are that algorithmic efficiency doubles every ~2 years. This market resolves YES if there is a 10x algorithmic efficiency gain within any six-month period before 2025, for a SOTA model in a major area of ML research (RL, sequence generation, translation, etc.).
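To give a sense of how far above trend this threshold is, here is a minimal sketch (the doubling-time figure is from the estimate above; the function and variable names are my own illustration, not part of the market's resolution criteria):

```python
import math

# Baseline trend: algorithmic efficiency doubles roughly every 2 years (24 months).
DOUBLING_MONTHS = 24

def efficiency_gain(months, doubling_months=DOUBLING_MONTHS):
    """Multiplicative efficiency gain expected over `months` at the trend rate."""
    return 2 ** (months / doubling_months)

# Expected gain over a six-month window at trend: 2^(6/24) ≈ 1.19x.
trend_gain = efficiency_gain(6)

# The market's threshold is 10x in the same window, which implies a
# doubling time of 6 / log2(10) ≈ 1.8 months — far faster than trend.
implied_doubling_months = 6 / math.log2(10)
```

In other words, resolution requires progress roughly an order of magnitude faster than the historical rate, which is what makes it a "discontinuity" rather than noise around the trend.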

It must be a SOTA model on a reasonably broad benchmark: at the end of the six-month period, it must take 1/10 the FLOPs to achieve SOTA performance. It can't be, for instance, performance on a single Atari environment or even a single small language benchmark.

In short: the jump needs to be significant enough that no one can reasonably argue it didn't happen, and it needs to be on something people actually care about.

I am using "algorithmic efficiency" rather than "increase in SOTA" because it's harder to define "discontinuity" across final performance. A "2x increase" is perfectly reasonable in RL but nonsensical for a classification task where SOTA is already 90%.
