
This question resolves YES if it is public knowledge that any ML model has been trained using more than 3.14E+23 FLOPs entirely on AMD GPUs (or, hypothetically, other ML accelerators produced by AMD in the future). Resolution is based on announcement time; if the model is trained before the deadline but only announced later, this resolves NO.
If the trained model is substantially worse than such a model should be, then it does not count towards resolution (e.g. if an LM is trained and its performance on standard benchmarks is only comparable to a model trained with 1/10th the compute). This is mostly intended to exclude failed attempts.
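For context, the 3.14E+23 threshold is roughly GPT-3's estimated training compute under the common 6·N·D rule of thumb (6 FLOPs per parameter per training token). A minimal sketch, assuming GPT-3's publicly reported figures of ~175B parameters and ~300B training tokens:

```python
def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate training compute via the standard 6*N*D rule of thumb."""
    return 6 * n_params * n_tokens

# GPT-3: ~175B parameters trained on ~300B tokens
gpt3_flops = training_flops(175e9, 300e9)
print(f"{gpt3_flops:.3e}")  # ~3.15e+23, close to the market's 3.14E+23 threshold
```

So the market is effectively asking whether a GPT-3-scale (or larger) model will be trained entirely on AMD hardware.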
Update 2026-01-24 (PST) (AI summary of creator comment): The creator will resolve this market NO in a few weeks if no qualifying models are brought to their attention.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ1,072 |
| 2 | | Ṁ966 |
| 3 | | Ṁ809 |
| 4 | | Ṁ257 |
| 5 | | Ṁ107 |
This doesn't quite qualify, but it's close. This model was trained with more FLOPs than GPT-3 on a mix of AMD GPUs and TPUs: https://www.essential.ai/research/rnj-1
GPT-3 isn't that big, the tinygrad work seems promising, and we have all of 2025.