On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks
84% chance
Tracking external bet: https://www.isattentionallyouneed.com/
This will resolve according to that site, under whatever conditions and caveats they have agreed on.
This question is managed and resolved by Manifold.
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
It even links the wager page in its bragging points:
All while being an “Attention-Free Transformer”
Related questions
Will there be over 10,000 Optimus robots working at Tesla before 2027?
12% chance
Will Optimus be used on the Optimus assembly line before the end of 2025?
20% chance
Which AI will be the best at the end of 2025?
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025?
44% chance
Will transformers still be the dominant DL architecture in 2026?
81% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
63% chance
Will Transformer-based architectures still be SOTA for language modelling by 2026?
80% chance
On LMSys, what will be the difference between the top model and GPT-4-1106 on Jan 1 2026?
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
10% chance
Will superposition in transformers be mostly solved by 2026?
73% chance