Is attention all you need? (transformers SOTA in 2027)
2027: 61% chance

This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).

Details can be found at https://www.isattentionallyouneed.com/

Proposition

On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
