
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
Resolved NO (Jan 4)
Resolves YES if, in 2025, Meta releases the weights of an LLM trained on at least 60T bytes of data (roughly equivalent, at ~4 bytes per token, to the 15T tokens used to train the Llama 3.1 models) that does not use standard fixed-vocabulary tokenization.
A qualifying model must be released under a license roughly as permissive as Llama 3.1.
This market was spurred by recent research from Meta showing a proof of concept for a tokenizer-free LLM, the Byte Latent Transformer. A qualifying model from Meta does not need to use the patching technique from that paper, as long as it does not use tokenization.
https://ai.meta.com/research/publications/byte-latent-transformer-patches-scale-better-than-tokens/
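For concreteness, here is a minimal sketch (plain Python; the toy vocabulary and variable names are hypothetical, not from the paper) of the distinction the resolution criterion draws between fixed-vocabulary tokenization and byte-level input, along with the bytes-to-tokens arithmetic behind the 60T threshold.

```python
# Illustrative sketch, not Meta's implementation: the toy vocabulary
# and variable names here are hypothetical.

text = "Hello, world!"

# Standard fixed-vocabulary tokenization: map text to IDs drawn from a
# fixed vocabulary learned ahead of time (real models use BPE
# vocabularies with ~100k entries).
vocab = {"Hello": 0, ",": 1, " world": 2, "!": 3}
token_ids = [vocab["Hello"], vocab[","], vocab[" world"], vocab["!"]]

# Tokenizer-free (byte-level) input: the model consumes raw UTF-8 bytes,
# so the effective "vocabulary" is just the 256 possible byte values.
byte_ids = list(text.encode("utf-8"))

print(token_ids)  # [0, 1, 2, 3]
print(byte_ids)   # [72, 101, 108, 108, 111, 44, 32, 119, ...]

# The rough arithmetic behind the 60T-byte threshold: at ~4 bytes of
# raw text per token, 15T tokens corresponds to about 60T bytes.
assert 15e12 * 4 == 60e12
```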
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ229 |
| 2 | | Ṁ224 |
| 3 | | Ṁ128 |
| 4 | | Ṁ76 |
| 5 | | Ṁ63 |