Resolves YES if any model in the Llama 4 (or potentially 4.1) series uses mixture of experts.
Related questions
Will there exist a service for full-parameter fine-tuning of Llama 3.1 405B?
80% chance
Will Llama 4 be fully open-weight?
70% chance
Will Llama 4 be the best LLM in the chatbot arena?
22% chance
Will Llama-4 be (open sourced and) as good as GPT-4?
64% chance
Is GPT-5 a mixture of experts?
79% chance
How many active parameters will the largest Llama 3 have?
77% chance
Will Llama 3-multimodal be natively mixed-multimodal? (VQ-VAE+next token prediction)
50% chance