Will I believe in 1 year that DeepSeek R1 was substantially trained via distillation of a US model?
6 traders · Ṁ1,142,026 volume
54% chance
This question is managed and resolved by Manifold.
Related questions
How much did DeepSeek-V3 cost to train?
Will there be an open replication of DeepSeek v3 for <$10m?
51% chance
Did DeepSeek violate OpenAI's terms of service by using OpenAI model outputs for distillation in 2024 or January 2025?
50% chance
Will OpenAI’s claims that DeepSeek is a distillation of their models become the consensus view?
63% chance
When will DeepSeek release R2?
Will DeepSeek become a closed AI lab by EOY?
27% chance
"Holy shit!" -> my reaction to deepseek r1. Will I feel the same about any AI developments in the next 5 months?
79% chance
Will ANY DeepSeek model cause or materially enable a catastrophic risk by 2027?
14% chance
Did DeepSeek lie about the GPU compute budget they used in the training of v3?
20% chance
Did DeepSeek use a cluster of more than 5,000 H100s in 2024?
41% chance