Will there be an open replication of DeepSeek v3 for <$10m?
Closes Dec 31 · 51% chance

Background

DeepSeek-V3 is a large language model that DeepSeek claims was trained using 2.788 million H800 GPU hours, at an estimated direct training cost of approximately $5.576 million. This is notably efficient compared to similar models such as Llama 3.1 405B, which required over 11 times more GPU hours.
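
As a rough sanity check on these figures, here is a minimal sketch of the arithmetic behind the claim, assuming the roughly $2 per H800 GPU-hour rental rate implied by DeepSeek's reported numbers and Meta's reported ~30.84 million GPU-hours for Llama 3.1 405B:

```python
# Rough sanity check of the claimed training-cost figures.
# Assumptions: ~$2/GPU-hour H800 rental rate implied by DeepSeek's own numbers,
# and Meta's reported ~30.84M GPU-hours for Llama 3.1 405B.

DEEPSEEK_V3_GPU_HOURS = 2.788e6     # H800 GPU-hours claimed by DeepSeek
ASSUMED_RATE_PER_HOUR = 2.00        # USD per H800 GPU-hour (assumed rental price)
LLAMA_31_405B_GPU_HOURS = 30.84e6   # GPU-hours reported by Meta for Llama 3.1 405B

direct_cost = DEEPSEEK_V3_GPU_HOURS * ASSUMED_RATE_PER_HOUR
ratio = LLAMA_31_405B_GPU_HOURS / DEEPSEEK_V3_GPU_HOURS

print(f"Implied direct training cost: ${direct_cost:,.0f}")  # ~$5,576,000
print(f"Llama 3.1 405B used ~{ratio:.1f}x more GPU-hours")   # ~11.1x
```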

Resolution Criteria

This market will resolve YES if both of the following hold:

  • A team openly publishes a reproducible or otherwise credible replication of DeepSeek-V3 with comparable performance

  • The compute cost is less than $10 million

The market will resolve NO if any of the following hold:

  • No successful replication is achieved by the resolution date

  • A replication is achieved but costs $10 million or more

  • A replication is achieved but there are no credible reports of the costs

Considerations

The cost of GPU compute will be calculated using December 2024 prices.
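
A minimal sketch of how such a resolution check might look, assuming a replication report lists GPU-hours by accelerator type and that December 2024 rental prices are supplied as inputs (the rates below are placeholder assumptions, not resolution values):

```python
# Hypothetical resolution check: compute a replication's compute cost from
# reported GPU-hours and assumed December 2024 rental prices, then compare
# against the $10 million threshold. Rates here are illustrative placeholders.

DEC_2024_RATES_USD_PER_HOUR = {  # assumed example rates, not authoritative
    "H800": 2.00,
    "H100": 2.50,
}

COST_THRESHOLD_USD = 10_000_000

def replication_cost(gpu_hours_by_type: dict[str, float]) -> float:
    """Total compute cost at the assumed December 2024 prices."""
    return sum(hours * DEC_2024_RATES_USD_PER_HOUR[gpu]
               for gpu, hours in gpu_hours_by_type.items())

def resolves_yes(credible_replication: bool, gpu_hours_by_type: dict[str, float]) -> bool:
    """YES requires a credible replication whose compute cost is under $10M."""
    return credible_replication and replication_cost(gpu_hours_by_type) < COST_THRESHOLD_USD

# Example: a hypothetical replication using 3.5M H800 GPU-hours
print(resolves_yes(True, {"H800": 3.5e6}))  # True: ~$7.0M < $10M
```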


Would a Llama model with similar claimed costs result in a Yes resolution?
