Will OpenAI’s claims that DeepSeek is a distillation of their models become the consensus view?
66% chance · Ṁ40 · closes Dec 31
https://www.ft.com/content/a0dfedd1-5255-4fa9-8ccc-1fe01de87ea6
OpenAI is alleging that DeepSeek is a distillation of the GPT models. Will this become the consensus view?
This question resolves as “yes” if I judge the majority of credible commentators to hold this view. If there’s some reasonable disagreement but a clear consensus, it still resolves as “yes”. I’ll weight Zvi’s reporting heavily in judging the consensus: https://open.substack.com/pub/thezvi
If there’s no consensus by the time this question closes, it resolves as “no”, even if it’s plausible that a consensus may be reached shortly thereafter.
This question is managed and resolved by Manifold.
Related questions
Did DeepSeek lie about the GPU compute budget they used in the training of v3?
18% chance
Did DeepSeek violate OpenAI's terms of service by using OpenAI model outputs for distillation in 2024 or January 2025?
51% chance
Will there be an open replication of DeepSeek v3 for <$10m?
51% chance
Will xAI be ahead of DeepSeek on June 30?
64% chance
Will DeepSeek become a closed AI lab by EOY?
27% chance
Will I believe in 1 year that DeepSeek R1 was substantially trained via distillation of a US model?
54% chance
Will ANY DeepSeek model cause or materially enable a catastrophic risk by 2027?
13% chance
Will OpenAI release next-generation models with varying capabilities and sizes?
64% chance
Did DeepSeek receive unannounced assistance from OpenAI in the creation of their v3 model?
9% chance
Will OpenAI models achieve ≥90% on SimpleBench by the end of 2025?
49% chance