On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks
84% chance
Tracking external bet: https://www.isattentionallyouneed.com/
This market will resolve according to that site, under whatever conditions and caveats they have agreed on.
This question is managed and resolved by Manifold.
People are also trading:
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025? (4% chance)
Google DeepMind announces a model that outperforms humans on the ARC-AGI-2 benchmark before January 15, 2026 (21% chance)
Will an AI achieve >85% performance on the FrontierMath benchmark before 2027? (45% chance)
Will transformers still be the dominant DL architecture in 2026? (81% chance)
On LMSys, what will be the difference between the top model and GPT-4-1106 on Jan 1 2026?
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture? (63% chance)
Will Transformer based architectures still be SOTA for language modelling by 2026? (80% chance)
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers? (10% chance)
Will superposition in transformers be mostly solved by 2026? (73% chance)
When will a non-Transformer model become the top open source LLM?
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
It even links the wager page among its bragging points:
All while being an “Attention-Free Transformer”