Will any open-source Transformers LLM that functions as a dense mixture of experts be released by the end of 2024?
Ṁ2268 · Jan 1 · 9% chance

Will any open-source or open-weights Transformers-based LLM emerge that is functionally a dense version of a mixture of experts, where the empirical mathematical sparsity resembles that of dense models like Llama 3.1 405B or Mistral Large Enough? A tool that allows for the creation of this type of model would also resolve YES, even if no model is released along with it, as long as it is possible to create such a model (for example, Mergekit, which supports various forms of model manipulation). A paper would only resolve YES if there was an accompanying model, functional code release, or implementation by a third party.
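For context on the terminology: the sketch below is a minimal illustration, not part of the market criteria, and its module names and layer sizes are arbitrary assumptions. It shows a "dense" mixture-of-experts layer in the sense used above, where every expert is evaluated on every token and the outputs are mixed by softmax gate weights, in contrast to the top-k routing sparsity of models like Mixtral 8x7B.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseMoE(nn.Module):
    """Illustrative dense MoE layer: every expert runs on every token and the
    outputs are mixed by softmax gate weights, so there is no routing sparsity."""
    def __init__(self, d_model=64, d_ff=256, n_experts=4):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                                      # x: (batch, seq, d_model)
        weights = F.softmax(self.gate(x), dim=-1)              # (batch, seq, n_experts)
        # A sparse MoE (e.g. Mixtral 8x7B) would keep only the top-k gate weights here;
        # a dense MoE keeps all of them.
        outs = torch.stack([e(x) for e in self.experts], -1)   # (batch, seq, d_model, n_experts)
        return (outs * weights.unsqueeze(-2)).sum(-1)          # (batch, seq, d_model)

if __name__ == "__main__":
    print(DenseMoE()(torch.randn(2, 8, 64)).shape)  # torch.Size([2, 8, 64])
```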


Is this even possible? I don't get it.

bought Ṁ250 NO

@Kearm20 This hasn't happened so far, right? Mixtral 8x7B and co don't count? They aren't "dense"?

@Bayesian It is technically "possible", but it looks like research has moved on to inference-time compute and mixture of agents.

@Kearm20 So you're not aware of any reason this market would resolve YES

?*

@Bayesian No. There was some promising early work that I followed, but it seemingly has fallen out of favor for Transformers LLMs as Qwen QwQ and other work has taken the spotlight. If a lab drops a dense MoE before the end of the year, though, it would resolve YES, and we still have two weeks.
