Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will anyone train a TokenFormer model using at least (the equivalent of) 200,000 H100-hours before 2026?
This question is managed and resolved by Manifold.
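For scale, here is a rough back-of-the-envelope conversion of the 200,000 H100-hour threshold into training compute. This is a minimal sketch, not part of the market text: the ~989 TFLOP/s dense BF16 peak and the ~40% utilization figure are assumptions.

```python
# Back-of-the-envelope: what does 200,000 H100-hours buy in training FLOPs?
# Assumed figures (not from the market): H100 dense BF16 peak of ~989e12 FLOP/s
# and ~40% model FLOPs utilization (MFU).
H100_PEAK_FLOPS = 989e12          # dense BF16 peak, FLOP/s (assumed)
MFU = 0.40                        # assumed utilization
GPU_SECONDS = 200_000 * 3600      # 200,000 GPU-hours expressed in seconds

total_flops = H100_PEAK_FLOPS * MFU * GPU_SECONDS
print(f"~{total_flops:.1e} training FLOPs")  # ~2.8e+23
```

Under those assumptions the threshold corresponds to roughly 3e23 training FLOPs, a substantial pretraining run but well below the largest current frontier-scale runs.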
Related questions
Will a new lab create a top-performing AI frontier model before 2028?
59% chance
Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
40% chance
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
29% chance
Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc)
57% chance
Will models be able to do the work of an AI researcher/engineer before 2027?
33% chance
Will there be a more sample-efficient pretraining algorithm than next token prediction for NLP before 2027?
43% chance
10GW AI training run before 2029?
43% chance
Will OpenAI release a tokenizer with more than 210000 tokens before 2026?
24% chance
Will a GPT-3 quality model be trained for under $10,000 by 2030?
83% chance
Before 2028, will any AI model achieve the same or greater benchmarks as o3 high with <= 1 million tokens per question?
69% chance