Do scaling laws happen because models experience a ton of tiny phase changes which average out to a smooth curve?


48% chance


Problem 5.31 from @NeelNanda's 200 Concrete Open Problems in Mechanistic Interpretability.

"**D* 5.31** - Hypothesis: The reason scaling laws happen is that models experience a *ton* of tiny phase changes, which average out to a smooth curve because of the law of large numbers. Can you find evidence for or against that? Are phase changes everywhere?"

Resolves to the best evidence available by the end of 2024.
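The hypothesis is easy to illustrate numerically. The sketch below (a toy model invented for illustration; none of the names or parameters come from the market or from Nanda's problem list) sums many near-step sigmoid "phase changes" occurring at random compute thresholds, and checks that the aggregate loss curve comes out smooth even though every individual component is almost a step function:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative assumption): each "circuit" i contributes a
# small loss reduction that switches on sharply at a random log-compute
# threshold t_i -- a tiny phase change.
n_circuits = 10_000
thresholds = rng.uniform(0.0, 10.0, size=n_circuits)   # where each phase change happens
sharpness = 40.0                                        # steepness of each individual transition
amplitudes = rng.exponential(1.0 / n_circuits, size=n_circuits)  # tiny per-circuit loss drops

log_compute = np.linspace(0.0, 10.0, 500)

def loss(x):
    # Total loss = baseline minus the sum of many sigmoid "steps",
    # normalized so the total achievable drop is 1.
    steps = amplitudes / (1.0 + np.exp(-sharpness * (x[:, None] - thresholds)))
    return 1.0 - steps.sum(axis=1) / amplitudes.sum()

curve = loss(log_compute)

# Even with near-step components, the aggregate curve changes only a
# little between adjacent grid points, i.e. it looks smooth.
print(f"max single-step change in aggregate curve: {np.abs(np.diff(curve)).max():.4f}")
```

Note this only shows that the averaging mechanism *could* produce smooth curves, not that it is what actually happens in trained models, which is what the problem asks for evidence on.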


## Related questions

- Do scaling laws happen because models experience a ton of tiny phase changes which average out to a smooth curve? (59% chance)
- Will GPT-4 improve on the Chinchilla scaling law? (43% chance)
- Will GPT-4 be trained (roughly) compute-optimally using the best-known scaling laws at the time? (30% chance)
- Will there be a major paradigm shift in physics, like Newtonian to Modern Physics, by the end of 2040? (32% chance)
- Will the source of the Dark Energy effect be found among currently known Standard Model particles? (46% chance)
- Will Scaling Laws for Neural Language Model continue to hold till the end of 2027? (65% chance)
- Is gravity fundamentally quantum? (76% chance)
- Is the Continuum Hypothesis true? (47% chance)
- Will loss curves on Pythia models of different sizes trained on the same data in the same order be similar? (76% chance)
- Will software-side AI scaling appear to be suddenly discontinuous before 2025? (18% chance)