@Sss19971997 Perhaps, but more likely we'll see 8x7B MoE (like Mixtral) and also a 70B dense model.
In that case, do you think this should resolve NO?
Related questions
What month will Llama 3 400B+ be released?
Will Llama-3 (or next open Meta model) be obviously good in its first-order effects on the world?
81% chance
How many active parameters will the largest Llama 3 have?
75% chance
Will the 400B+ open source Llama 3 model rank higher than GPT-4-Turbo-2024-04-09 on the lmsys leaderboard?
53% chance
Will Llama-4 be (open sourced and) as good as GPT-4?
74% chance
Will Llama-3 be multimodal?
84% chance
Will Llama 3 400B be better than GPT-4?
50% chance
When will Llama 2 start offering a paid subscription service?
Will Meta's stock rise in the month after Llama 3 is released?
37% chance
Will Llama 3 405B be open sourced?
89% chance