Is attention all you need? (transformers SOTA in 2027)
48% chance
This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).
Details can be found at https://www.isattentionallyouneed.com/
Proposition
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
Other markets on the same question:
This question is managed and resolved by Manifold.
What about hybrid models, like Jamba? They might be the best of both worlds.
bought Ṁ4 YES at 61%
predicts YES
@EchoNolan I talked to Sasha, and his response is basically that as long as the E in the MoE is a Transformer, it's a Transformer.
@jacksonpolack Hm, I will add a subsidy at a later point, wherever the market stabilizes, to maintain that.