How many non-Transformer based models will be in the top 10 on HuggingFace Leaderboard in the 7B range by July?
16 traders · Ṁ769 · Jul 2
0: 39%
1-2: 32%
3-5: 24%
6+: 4%
For resolution I’ll go to the HuggingFace leaderboard, select the ~7B filter, and uncheck everything else.
I’ll refrain from participating in the market to stay neutral in case a hybrid case comes up. I’d count both Mamba and StripedHyena as non-Transformers.
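One way the "non-Transformer" test could be made mechanical is by looking at each model's `model_type` field in its Hugging Face `config.json`. This is only a hedged sketch: the set of architecture strings below is my own assumption about what would count, not an official taxonomy or the market creator's stated method.

```python
# Hedged sketch: classify a model as non-Transformer by its Hugging Face
# config.json `model_type` string. The membership set is an assumption;
# hybrid architectures would still need a judgment call.
NON_TRANSFORMER_TYPES = {"mamba", "mamba2", "stripedhyena", "rwkv"}

def is_non_transformer(model_type: str) -> bool:
    """Return True if the architecture counts as non-Transformer.

    Per the comment thread, sliding-window attention still counts as
    attention, so Mistral- or Longformer-style models remain Transformers.
    """
    return model_type.lower() in NON_TRANSFORMER_TYPES

# Example: Mamba counts as non-Transformer; Mistral does not.
assert is_non_transformer("mamba")
assert not is_non_transformer("mistral")
```

A resolver could apply this to the `model_type` of each of the top 10 entries and count how many return True, falling back to manual judgment for hybrids.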
This question is managed and resolved by Manifold.
@HanchiSun I’d say sliding window is a type of attention. I’d consider LongFormers as a type of Transformer.
@HanchiSun Out of curiosity, would you bet differently if it was for the 3B category rather than 7B?
@KLiamSmith Good question. It is definitely harder to experiment with 7B than with 3B, but even for 3B I doubt more than 2 non-attention architectures will be better.
Related questions
When will a non-Transformer model become the top open source LLM?
Manifold Top Creators Leaderboard Ranking Prediction (2024)
Manifold Top Traders Leaderboard Ranking Prediction (2024)
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks (54% chance)
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025? (55% chance)
Will Transformer based architectures still be SOTA for language modelling by 2026? (70% chance)
Will the top model by OpenAI rank 3rd (or lower) behind 2 other model families at any point before 2026? (41% chance)
Number of public models on 🤗Hugging Face by EOY 2024
Will any open-source model rank in the top 3 on Chatbot Arena at the end of 2024? (15% chance)
Will China have a model in the top 10 on LMSYS Chatbot Arena on March 1, 2025? (77% chance)