On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks
80% chance
Tracking external bet: https://www.isattentionallyouneed.com/
This question is managed and resolved by Manifold.
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
It even links the wager page among its bragging points: "All while being an 'Attention-Free Transformer'"
Related questions
A major ML paper demonstrates a symbolic-enhanced transformer successor outperforming standard transformers by March 2025
16% chance
Will transformers still be the dominant DL architecture in 2026?
80% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will Transformer-based architectures still be SOTA for language modelling by 2026?
79% chance
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025?
68% chance
On LMSys, what will be the difference between the top model and GPT-4-1106 on Jan 1 2026?
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
27% chance
Will superposition in transformers be mostly solved by 2026?
73% chance
When will a non-Transformer model become the top open source LLM?
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
68% chance