
Will we see most new language models shifting to addition-only architectures like BitNet / BitNet b1.58 in 2024?
27% chance (closed Jan 1)
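For context on the premise: BitNet b1.58 constrains weights to the ternary set {-1, 0, +1}, so a matrix-vector product reduces to additions and subtractions of activations, with no multiplications. A minimal sketch of that idea (the shapes and random data are illustrative, not from the paper):

```python
import numpy as np

# Why ternary ("1.58-bit") weights make inference addition-only:
# with W[i, j] in {-1, 0, +1}, each output y_i = sum_j W[i, j] * x_j
# is just a sum of +x_j and -x_j terms.

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))   # ternary weight matrix
x = rng.standard_normal(8)             # activations (kept in higher precision)

# Addition-only evaluation: add x_j where W[i, j] == +1, subtract where == -1.
y_add = np.array(
    [x[W[i] == 1].sum() - x[W[i] == -1].sum() for i in range(W.shape[0])]
)

# Matches the ordinary matrix product.
assert np.allclose(y_add, W @ x)
```

Whether this efficiency argument translates into "most new language models" adopting such architectures is exactly what the market is pricing.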
This question is managed and resolved by Manifold.
Related questions
By the end of 2026, will we have transparency into any useful internal pattern within a Large Language Model whose semantics would have been unfamiliar to AI and cognitive science in 2006?
32% chance
Will Inflection AI have a model that is 10× the size of the original GPT-4 at the end of Q1 2025?
14% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
79% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
51% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
43% chance
By 2028, will there be a language model of less than 10B parameters that is superior to GPT-4?
84% chance
Will we learn by EOY 2024 that large AI labs use something like activation addition on their best models?
23% chance
By 2030, will large language models still be at the peak of AI? [DRAFT]
25% chance
Most popular language model from OpenAI competitor by 2026?
36% chance
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
75% chance