Is attention all you need? (transformers SOTA in 2027)
61% chance
This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).
Details can be found at https://www.isattentionallyouneed.com/
Proposition
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
This question is managed and resolved by Manifold.