
Will we see most new language models shifting to addition-only architectures like BitNet/BitNet 1.58b in 2024?
Ṁ85 · resolved Dec 27
Resolved NO
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ26 |
| 2 | | Ṁ22 |
Related questions
Will there be an AI language model that strongly surpasses ChatGPT and other OpenAI models at the end of 2025?
92% chance
By the end of 2026, will we have transparency into any useful internal pattern within a Large Language Model whose semantics would have been unfamiliar to AI and cognitive science in 2006?
10% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
97% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
17% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
9% chance
Most popular language model from OpenAI competitor by 2026?
13% chance
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
91% chance
Will all of the publicly accessible parts of heavengames.com/aok.heavengames.com become part of a large language model like Claude or GPT by 2025?
51% chance
Will AI (large language models) collapse by May 2026?
11% chance
By 2030, will large language models still be at the peak of AI? [DRAFT]
25% chance