Will OpenAI release a tokenizer with vocab size > 150k in 2023?
Resolved NO (Jan 1)
  1. The GPT-2 model used r50k_base: vocab size ≈ 50k

  2. The GPT-3 model used r50k_base: vocab size ≈ 50k

  3. The GPT-3.5 model used cl100k_base: vocab size ≈ 100k

  4. The GPT-4 model used cl100k_base: vocab size ≈ 100k
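
For anyone who wants to verify these numbers, here is a minimal sketch using OpenAI's open-source `tiktoken` package (assuming it is installed via `pip install tiktoken`). The `n_vocab` attribute counts all token values, including special tokens.

```python
# Minimal sketch: print the vocab size of each public OpenAI encoding
# mentioned above. Requires: pip install tiktoken
import tiktoken

for name in ["r50k_base", "cl100k_base"]:
    enc = tiktoken.get_encoding(name)  # load the named BPE encoding
    print(f"{name}: n_vocab = {enc.n_vocab}")

# Prints roughly:
#   r50k_base: n_vocab = 50257
#   cl100k_base: n_vocab = 100277
# Both fall well short of the 150k threshold this market asked about.
```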


Can this be resolved early? Looks like it'll be a NO...
