On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
38% chance
Tracking external bet: https://www.isattentionallyouneed.com/
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
The post even links the wager page in its bragging points:
All while being an “Attention-Free Transformer”
Related questions
Tesla Cybertruck will be in the top 25 selling car models for 2025 · 40% chance
6) An alternative to the transformer architecture will see meaningful adoption. · 70% chance
Will Mamba be the de-facto paradigm for LLMs over transformers by 2025? · 30% chance
Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version) · 37% chance
Eliezer Yudkowsky is impressed by a machine learning model, and believes that the model may be very helpful for alignment research, by the end of 2026 · 29% chance
Will superposition in transformers be mostly solved by 2026? · 54% chance
Will Transformer based architectures still be SOTA for language modelling by 2026? · 66% chance
Will Etched.AI launch a transformer chip product before 2025? · 18% chance
Is attention all you need? (transformers SOTA in 2027) · 45% chance
Will transformers still be the dominant DL architecture in 2026? · 57% chance