Will a Mamba 7B model trained on 2 trillion tokens outperform Llama2-13B?
66% chance
Question will resolve positive if someone trains a Mamba (https://twitter.com/tri_dao/status/1731728602230890895) language model with <=7.5 billion parameters on <=2 trillion tokens that outperforms Llama2-13B on the Hugging Face Open LLM Leaderboard (https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
This question is managed and resolved by Manifold.
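For reference, here is a minimal sketch of how one might check the two quantitative criteria in the resolution text: the parameter cap (<=7.5B) and the candidate's leaderboard average versus Llama2-13B. The checkpoint name and all scores below are placeholders, not real leaderboard data; it assumes the leaderboard average is the mean of the six v1 benchmarks (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K).

```python
# Sketch: checking the resolution criteria. Model ID and scores are
# illustrative placeholders, not actual leaderboard data.
import torch
from transformers import AutoModelForCausalLM

PARAM_CAP = 7_500_000_000  # <= 7.5 billion parameters


def count_parameters(model_id: str) -> int:
    """Load a candidate checkpoint and count its parameters."""
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
    return sum(p.numel() for p in model.parameters())


def leaderboard_average(scores: dict[str, float]) -> float:
    """Open LLM Leaderboard-style average: mean of the benchmark scores."""
    return sum(scores.values()) / len(scores)


if __name__ == "__main__":
    candidate_id = "org/candidate-mamba-7b"  # hypothetical checkpoint name
    n_params = count_parameters(candidate_id)
    print(f"{n_params:,} parameters, within cap: {n_params <= PARAM_CAP}")

    # Placeholder scores; real values would come from the leaderboard rows.
    candidate_scores = {"ARC": 0.0, "HellaSwag": 0.0, "MMLU": 0.0,
                        "TruthfulQA": 0.0, "Winogrande": 0.0, "GSM8K": 0.0}
    llama2_13b_avg = 0.0  # fill in from the Llama-2-13b leaderboard row
    print("Outperforms Llama2-13B:",
          leaderboard_average(candidate_scores) > llama2_13b_avg)
```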
Related questions
What will be true of the first model to cross 1400 on lmarena.ai?
Will a 15 billion parameter LLM match or outperform GPT4 in 2024?
24% chance
Will the Jan 2024 version of the LLM detector "Binoculars" be effective against OpenAI's best model at end 2024?
59% chance
Will an open-source pure Mamba LLM surpass 82 on MMLU (5-shot) before end of year 2024?
25% chance
When will OpenAI release a more capable LLM?
Will Llama 3-multimodal be natively mixed-multimodal? (VQ-VAE+next token prediction)
50% chance
Will any open-source model achieve GPT-4 level performance on MMLU through 2024?
83% chance
Will the next major LLM by OpenAI use a new tokenizer?
76% chance
Will the next LLM released by OpenAI be worse than GPT-4 at MMLU?
16% chance
Will a single model running on a single consumer GPU (<1.5k 2020 USD) outperform GPT-3 175B on all benchmarks in the original paper by 2025?
86% chance