
Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will anyone train a TokenFormer model using at least (the equivalent of) 200,000 H100-hours before 2026?
This question is managed and resolved by Manifold.
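For context on the threshold, here is a rough back-of-the-envelope conversion of 200,000 H100-hours into effective training FLOP. The peak-throughput and utilization figures below are illustrative assumptions, not part of the market's resolution criteria, and the dense-transformer C ≈ 6ND approximation is only indicative for TokenFormer, which replaces linear projections with token-parameter attention.

```python
# Rough sketch (assumed figures, not from the market text):
# dense BF16 tensor-core peak of ~989 TFLOP/s per H100, 40% MFU.

H100_HOURS = 200_000
PEAK_FLOP_PER_S = 989e12   # assumed dense BF16 peak per H100
MFU = 0.40                 # assumed model FLOP utilization

total_flop = H100_HOURS * 3600 * PEAK_FLOP_PER_S * MFU
# ~2.85e23 FLOP, i.e. roughly GPT-3-scale compute (~3.1e23 FLOP)
print(f"effective training compute: ~{total_flop:.2e} FLOP")

# Dense-transformer approximation C ~= 6 * N * D: at a given parameter
# count, how many training tokens does this budget cover?
n_params = 7e9
tokens = total_flop / (6 * n_params)
print(f"at {n_params:.0e} params: ~{tokens:.2e} tokens")  # ~6.78e12
```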
Related questions
Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day? (99% chance)
Before 2028, will anyone train a GPT-4-level model in a minute? (21% chance)
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025? (20% chance)
Will OpenAI release a tokenizer with more than 210,000 tokens before 2026? (24% chance)
Will a lab train a >=1e26 FLOP state space model before the end of 2025? (15% chance)
AI: Will someone train a $1B model by 2028? (81% chance)
Will we see a public GPU compute sharing pool for LLM model training or inference before 2026? (86% chance)
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1, 2026? (84% chance)
Will a model as great as GPT-5 be available to the public in 2025? (99% chance)
Will a GPT-4-quality model be trained for under $10,000 by 2030? (86% chance)