
Will any open-source Transformers LLM that functions as a dense mixture of experts be released by the end of 2024?
1k · Ṁ2,268 · resolved Jan 1
Resolved NO
Will any open-source or open-weights Transformers-based LLM emerge that is functionally a dense version of a mixture-of-experts model, i.e. whose empirical mathematical sparsity resembles that of dense models like Llama 3.1 405B or Mistral Large Enough? A tool that allows the creation of this type of model would resolve YES even if no model is released along with it, as long as the tool makes it possible to create such a model (for example, Mergekit for various kinds of model manipulation). A paper alone would only resolve YES if there were an accompanying model, functional code released, or an implementation by a third party.
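For intuition about what a qualifying weight-manipulation tool might do, here is a minimal, hypothetical sketch of one conceivable approach: collapsing the experts of a single MoE feed-forward block into one dense block by (optionally router-usage-weighted) averaging of the expert weights. This is only an illustration of the general idea, not how Mergekit or any released tool actually works; every class, function, and variable name below is invented for the example.

```python
# Hedged sketch: turn one MoE feed-forward layer into a dense layer by
# averaging its experts' weights. All names here are illustrative only.
import torch
import torch.nn as nn


class Expert(nn.Module):
    """One expert: a standard two-layer feed-forward block."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(torch.relu(self.up(x)))


def densify_moe_layer(experts: list[Expert], usage: torch.Tensor | None = None) -> Expert:
    """Return a dense feed-forward block whose weights are the (optionally
    usage-weighted) average of the experts' weights.

    `usage` is a per-expert weighting, e.g. an empirical routing frequency;
    if omitted, a plain average is used.
    """
    n = len(experts)
    usage = torch.full((n,), 1.0 / n) if usage is None else usage / usage.sum()

    d_model = experts[0].up.in_features
    d_ff = experts[0].up.out_features
    dense = Expert(d_model, d_ff)

    with torch.no_grad():
        for name in ("up", "down"):
            # Weighted average of the corresponding weight and bias tensors.
            weight = sum(w * getattr(e, name).weight for w, e in zip(usage, experts))
            bias = sum(w * getattr(e, name).bias for w, e in zip(usage, experts))
            getattr(dense, name).weight.copy_(weight)
            getattr(dense, name).bias.copy_(bias)
    return dense


if __name__ == "__main__":
    experts = [Expert(16, 64) for _ in range(8)]
    dense_ffn = densify_moe_layer(experts, usage=torch.rand(8))
    print(dense_ffn(torch.randn(2, 16)).shape)  # torch.Size([2, 16])
```

Whether such a weight-averaged model would actually reproduce the empirical sparsity profile of a natively dense model like Llama 3.1 405B is exactly the open question this market asked about.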
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ607 |
| 2 | | Ṁ92 |
| 3 | | Ṁ4 |
Related questions
Will Transformer-Based LLMs Make Up ≥75% of Parameters in the Top General AI by 2030?
47% chance
When will a non-Transformer model become the top open-source LLM?
OpenAI to release model weights by EOY?
83% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
80% chance
Are Mixture-of-Experts (MoE) transformer models generally more human-interpretable than dense transformers?
45% chance
When will OpenAI release an open source model?
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
10% chance
Will the most capable public multimodal model at the end of 2027, in my judgement, use a transformer-like architecture?
63% chance
Will superposition in transformers be mostly solved by 2026?
73% chance