
Will we see most new language models shifting to addition-only architectures like BitNet/BitNet 1.58b in 2024?
27% chance
This question is managed and resolved by Manifold.
Related questions
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
79% chance
Will Inflection AI have a model that is 10X the size of the original GPT-4 by the end of Q1 2025?
14% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
51% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
38% chance
By 2028, will there be a language model of less than 10B parameters that is superior to GPT-4?
84% chance
Will we learn by EOY 2024 that large AI labs use something like activation addition on their best models?
23% chance
By 2030, will large language models still be at the peak of AI? [DRAFT]
30% chance
Most popular language model from OpenAI competitor by 2026?
38% chance
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
68% chance