On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks
54% chance
Tracking external bet: https://www.isattentionallyouneed.com/
This question is managed and resolved by Manifold.
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
It even links this wager page among its bragging points:
"All while being an 'Attention-Free Transformer'"
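For context on what "attention-free" means here, below is a minimal, hypothetical sketch of an RWKV-style linear time-mixing step (not the actual Eagle-7B code): a per-channel recurrence over the sequence replaces pairwise attention, so cost grows linearly rather than quadratically with sequence length. The function name and the simplifications (no bonus term for the current token, toy normalization) are illustrative assumptions only.

```python
import numpy as np

def attention_free_mix(k, v, w):
    """Simplified RWKV-style time mixing: a per-channel linear recurrence
    replaces pairwise attention, so cost is O(T) per channel instead of O(T^2).

    k, v: (T, C) arrays of keys and values; w: (C,) positive decay per channel.
    Returns a (T, C) array of mixed outputs. This is a toy sketch, not the real
    RWKV kernel (which adds a separate bonus for the current token and
    numerical safeguards).
    """
    T, C = k.shape
    num = np.zeros(C)        # running exponentially weighted sum of values
    den = np.zeros(C)        # running sum of the weights themselves
    out = np.empty((T, C))
    decay = np.exp(-w)       # exponential decay applied at every step
    for t in range(T):
        num = decay * num + np.exp(k[t]) * v[t]
        den = decay * den + np.exp(k[t])
        out[t] = num / (den + 1e-8)
    return out

# Example: mix a random sequence of 8 tokens with 4 channels.
rng = np.random.default_rng(0)
y = attention_free_mix(rng.normal(size=(8, 4)), rng.normal(size=(8, 4)), np.ones(4))
print(y.shape)  # (8, 4)
```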
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
61% chance
Will any model get above human level (92%) on the Simple Bench benchmark before September 1st, 2025?
52% chance
By the end of Q1 2025 will an open source model beat OpenAI’s o1 model?
66% chance
Will there be a model that has a 75% win rate against the latest iteration of GPT-4 as of January 1st, 2025?
62% chance
Will transformers still be the dominant DL architecture in 2026?
61% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
68% chance
When will a non-Transformer model become the top open source LLM?
Will superposition in transformers be mostly solved by 2026?
73% chance
Which AI will be the best at the end of 2024?