Will Meta stop sharing LLM weights in 2024?
Resolved NO on May 3
This market resolves NO if Meta releases a language model more powerful than Llama 2 (specifically, llama2-70b-chat) for public download during 2024, in the same way that Llama 2 is currently available for public download at https://ai.meta.com/llama/. It resolves YES otherwise. A release before Jan 1, 2024 does not trigger a NO resolution.
See also https://www.governance.ai/research-paper/open-sourcing-highly-capable-foundation-models and metaprotest.org.
This question is managed and resolved by Manifold.
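
The resolution rule amounts to a simple decision procedure. Below is a minimal Python sketch of that logic under stated assumptions: the `Release` record and its fields (including the `more_powerful_than_llama2_70b_chat` flag) are hypothetical placeholders for judgments the market creator made by hand, not code Manifold actually runs.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Release:
    """Hypothetical record of a Meta model release (illustrative only)."""
    name: str
    release_date: date
    publicly_downloadable: bool                # weights downloadable, as Llama 2 is
    more_powerful_than_llama2_70b_chat: bool   # in practice a judgment call, not a flag


def resolve(releases: list[Release]) -> str:
    """NO if any qualifying release lands during 2024, YES otherwise."""
    for r in releases:
        if (
            r.release_date.year == 2024        # releases before Jan 1, 2024 don't count
            and r.publicly_downloadable
            and r.more_powerful_than_llama2_70b_chat
        ):
            return "NO"
    return "YES"


# Llama 3's open-weights release in April 2024 is the kind of event that triggers NO,
# consistent with this market's actual resolution.
print(resolve([Release("Llama 3 70B", date(2024, 4, 18), True, True)]))  # NO
```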
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ152 |
| 2 | | Ṁ70 |
| 3 | | Ṁ33 |
| 4 | | Ṁ28 |
| 5 | | Ṁ21 |
Related questions
- Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025? (20% chance)
- Will Meta ever deploy its best LLM without releasing its model weights up through AGI? (75% chance)
- OpenAI to release model weights by EOY? (83% chance)
- Will Meta's Llama family of models reach 2 billion downloads by September 30, 2025? (75% chance)
- Will there be a successful application of diffusion-like weight modification in LLMs before 2027? (27% chance)
- Will Meta Threads still be around in 2028? (60% chance)
- Will Meta censor its future open weights models according to Chinese-developed techniques? (32% chance)
- When will the weights for Apple's on-device LLM be leaked? (5/23/26)
- When will Meta release Llama 5? (2/3/26)
- Will Meta’s stock split before 2026? (18% chance)