Is attention all you need? (transformers SOTA in 2027)
2027 · 45% chance

This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).

Details can be found at https://www.isattentionallyouneed.com/

Proposition

On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.


predicts YES

Yes, given that an architecture qualifies if it leverages a combination of transformer models and supporting infra components that wouldn't be considered breakthrough technologies on their own (e.g. RAG).
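
For concreteness, here's a minimal sketch of the RAG pattern being pointed at (numpy only; the toy embedding, corpus, and prompt-assembly helpers are hypothetical illustrations, not anything from the bet's terms). The point is that the retrieval side is ordinary infra bolted onto an unchanged Transformer LM, not a new architecture:

```python
# Minimal RAG sketch: retrieve context, prepend it to the query,
# hand the augmented prompt to an unmodified Transformer LM.
import numpy as np

corpus = [
    "Transformers use self-attention over token sequences.",
    "Mixture-of-experts layers route tokens to specialist sub-networks.",
    "State space models are a proposed alternative to attention.",
]

def embed(text: str) -> np.ndarray:
    # Toy embedding: normalized character-frequency vector. A real
    # system would use a learned encoder; this keeps the sketch
    # self-contained and runnable.
    v = np.zeros(26)
    for ch in text.lower():
        if "a" <= ch <= "z":
            v[ord(ch) - ord("a")] += 1
    return v / (np.linalg.norm(v) + 1e-9)

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank corpus documents by cosine similarity to the query.
    q = embed(query)
    scores = [float(q @ embed(doc)) for doc in corpus]
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]

def rag_prompt(query: str) -> str:
    # The "augmentation" step: the Transformer itself is untouched;
    # only its input changes.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(rag_prompt("What do mixture-of-experts layers do?"))
```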

So do mixtures of experts count? The linked page does not contain any actual details.

predicts YES

@EchoNolan I talked to Sasha, and his response is basically that as long as the E in the MoE is a Transformer, it's a transformer.
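
For what that distinction looks like in practice, here's a minimal sketch (assuming PyTorch; the class names, toy dimensions, and top-1 routing are illustrative, not from the bet's terms) of an MoE layer whose experts are ordinary Transformer feed-forward blocks, so the overall model stays Transformer-shaped:

```python
import torch
import torch.nn as nn

class TransformerExpert(nn.Module):
    """A standard Transformer FFN block: Linear -> GELU -> Linear."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.GELU(),
            nn.Linear(d_ff, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class MoELayer(nn.Module):
    """Top-1 routed mixture of Transformer FFN experts."""
    def __init__(self, d_model: int, d_ff: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            [TransformerExpert(d_model, d_ff) for _ in range(n_experts)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-1 expert.
        gate = self.router(x).softmax(dim=-1)   # (tokens, n_experts)
        top1 = gate.argmax(dim=-1)              # (tokens,)
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top1 == i
            if mask.any():
                # Scale by the gate value so routing stays differentiable.
                out[mask] = expert(x[mask]) * gate[mask, i].unsqueeze(-1)
        return out

# Usage: this layer replaces the single FFN inside a Transformer block;
# attention and the rest of the architecture are unchanged.
layer = MoELayer(d_model=64, d_ff=256, n_experts=4)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```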

bought Ṁ15 of NO

I have strong principled reasons this should stay at 50% for the next 24 hours.

bought Ṁ3 of YES
bought Ṁ10 of NO

subsidy phase-in

predicts YES

@jacksonpolack Hm, I will add in the subsidy at a later point, wherever the market stabilizes, to maintain that.
