Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
46% chance
Small-number arithmetic in the training set is fine, as is non-arithmetic data. "Small" and "large" are relative: if the training set contains arithmetic up to 20 digits and the model generalizes to 100 digits, the question resolves yes.
I'll also accept a subset of arithmetic, e.g. if the model can do large-number addition but not multiplication, the question resolves yes.
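As a sketch of what a resolution test for the addition case could look like (the `answer_fn` hook is a hypothetical stand-in for querying whatever model is being evaluated; nothing here is a fixed resolution procedure):

```python
import random

def make_addition_problem(digits):
    """Generate two random n-digit operands and their exact sum."""
    lo, hi = 10 ** (digits - 1), 10 ** digits - 1
    a, b = random.randint(lo, hi), random.randint(lo, hi)
    return a, b, a + b

def accuracy(answer_fn, digits=100, trials=50):
    """Fraction of n-digit addition problems answer_fn answers exactly."""
    correct = 0
    for _ in range(trials):
        a, b, total = make_addition_problem(digits)
        if answer_fn(a, b) == total:
            correct += 1
    return correct / trials

# A perfect oracle scores 1.0; a real answer_fn would parse the model's
# text output into an int before comparing.
print(accuracy(lambda a, b: a + b, digits=100))
```

The key point the criteria above encode: `digits` here (100) must exceed the largest operand size seen in training (e.g. 20) for a high score to count as generalization.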
This question is managed and resolved by Manifold.
Related questions
Will a large language model beat a super grandmaster playing chess by 2028? (53% chance)
Will a Large Language Model save a human life through medical advice by the end of 2025? (92% chance)
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026? (43% chance)
Will Transformer based architectures still be SOTA for language modelling by 2026? (80% chance)
By 2030, will large language models still be at the peak of AI? [DRAFT] (25% chance)
Will an AI model use more than 1e28 FLOPS in training before 2026? (8% chance)
Will reinforcement learning overtake LMs on math before 2028? (70% chance)
Will a Large Language Model be listed as an author on a peer-reviewed paper by the end of 2025? (34% chance)
Will a Language Model under 10B parameters play chess at Grandmaster level by 2050? (88% chance)
Will Scaling Laws for Neural Language Model continue to hold till the end of 2026? (82% chance)