Will we see most new language models shifting to addition-only architectures like BitNet/BitNet b1.58 in 2024?
43% chance · Ṁ35 · Jan 1
This question is managed and resolved by Manifold.
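For context on the question: BitNet b1.58 constrains each weight to the ternary set {-1, 0, +1}, so a matrix-vector product needs no multiplications, only additions and subtractions. A minimal illustrative sketch of that property (the function name and shapes are hypothetical, not from any BitNet implementation):

```python
def ternary_matvec(W, x):
    """Matrix-vector product with ternary weights in {-1, 0, +1}.

    Because every weight is -1, 0, or +1, each output element is just
    a sum of (possibly negated) inputs -- the "addition-only" property
    the question refers to.
    """
    out = []
    for row in W:
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:
                acc += xi       # +1 weight: add the input
            elif w == -1:
                acc -= xi       # -1 weight: subtract the input
            # 0 weight: contributes nothing, skip entirely
        out.append(acc)
    return out

# Example: a 2x3 ternary weight matrix applied to a length-3 input
W = [[1, 0, -1],
     [-1, 1, 1]]
x = [0.5, 2.0, 1.5]
print(ternary_matvec(W, x))  # [-1.0, 3.0]
```

Real implementations pack the ternary weights into low-bit storage and use vectorized kernels; this loop only demonstrates why no multiply instruction is required.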
Related questions
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
54% chance
Will there be an AI language model that strongly surpasses ChatGPT and other OpenAI models before the end of 2024?
9% chance
By the end of 2026, will we have transparency into any useful internal pattern within a Large Language Model whose semantics would have been unfamiliar to AI and cognitive science in 2006?
37% chance
Will Transformer-based architectures still be SOTA for language modelling by 2026?
68% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
51% chance
Will Meta release an open-source language model that outperforms GPT-4 by the end of 2024?
63% chance
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
Will a language model that runs locally on a consumer cellphone beat GPT-4 by EOY 2026?
70% chance
Will Scaling Laws for Neural Language Models continue to hold until the end of 2027?
67% chance