Is Attention All You Need?
48% chance
This market will resolve according to the website below:
https://www.isattentionallyouneed.com/
Proposition:
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
For the Motion: Jonathan Frankle (@jefrankle), Harvard Professor and Chief Scientist at MosaicML
Against the Motion: Sasha Rush (@srush_nlp), Cornell Professor and Research Scientist at Hugging Face 🤗
Context
Coming soon
This question is managed and resolved by Manifold.