Will I lead a completed pretraining of a >=1B param language model before EOY 2024?
87% chance
Must be trained on at least 100B tokens, and start from random initialization. Distillation is okay only if it meets these requirements.
I'll cast an initial vote and then not participate further in this market.
Related questions
Will a large language model beat a super grandmaster playing chess by 2028?
48% chance
Will there be an AI language model that surpasses ChatGPT and other OpenAI models before the end of 2024?
30% chance
Will it cost less than 100k USD to train and run a language model that outperforms GPT-3 175B on all benchmarks by the end of 2024?
85% chance
Will any product built using a large language model receive FDA clearance by the end of 2024?
18% chance
Will a Large Language Model be deployed on a mission to land on a moon or planet by the end of 2030?
27% chance
Will there be an LLM which can do fluent conlang translations by EOY 2024?
67% chance
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
Will a language model that runs locally on a consumer cellphone beat GPT4 by EOY 2026?
48% chance
By 2028, will there be a language model of less than 10B parameters that is superior to GPT-4?
81% chance
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027?
16% chance