Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
55% chance

Inspired by this tweet: https://x.com/aidan_mclau/status/1859444783850156258

The claim here appears to be that labs have trained very large base models (it is unclear how large) but cannot instruction-tune them. If this is a real phenomenon that cannot be quickly overcome, AI development from here seems likely to be very strange.
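For readers unfamiliar with the term: instruction tuning is ordinarily just supervised fine-tuning of a pretrained base model on (prompt, response) pairs, with the next-token loss masked so that only response tokens contribute. The sketch below illustrates the idea on a toy model; the architecture, data, and hyperparameters are hypothetical stand-ins for illustration, not anything a lab actually uses.

```python
# Minimal sketch of instruction tuning (supervised fine-tuning). The toy model
# stands in for a pretrained base model; data and sizes are placeholders.
import torch
import torch.nn as nn

VOCAB, DIM = 1000, 64

class ToyCausalLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, ids):
        n = ids.size(1)
        # Causal mask: each position may attend only to earlier positions.
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        h = self.blocks(self.embed(ids), mask=mask)
        return self.head(h)

model = ToyCausalLM()  # stands in for the pretrained base model
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One (prompt, response) pair, already tokenized; real SFT uses many such pairs.
prompt = torch.randint(0, VOCAB, (1, 8))
response = torch.randint(0, VOCAB, (1, 8))
ids = torch.cat([prompt, response], dim=1)

logits = model(ids)
# Next-token loss, masked so only response tokens are trained on: the model
# learns to produce the response given the prompt, not to predict the prompt.
targets = ids[:, 1:].clone()
targets[:, : prompt.size(1) - 1] = -100  # ignore prompt positions
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, VOCAB), targets.reshape(-1), ignore_index=-100
)
loss.backward()
opt.step()
print(f"SFT step loss: {loss.item():.3f}")
```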

This market resolves YES if a model is released before January 1, 2026 that is confirmed to have at least 10 trillion parameters and follows instructions (i.e., it is not a base model). Labs are not eager to release parameter counts: it is still not clear how many parameters Claude 3 Opus has, despite its release in February 2024. As a result, this market may not resolve until long after January 1, 2026. However, I will resolve it NO early if I judge that every model released before then is very unlikely to have this many parameters (for example, if they are all very fast or priced similarly to previous models). There is some subjectivity here, so I will not trade on this market.
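To see why speed and pricing can bound parameter count, here is a rough compute-only cost floor for serving a dense 10T-parameter model. Every constant below (H100-class FLOP/s, 80 GB of memory per accelerator, $2/GPU-hour, 40% utilization, ~2 FLOPs per parameter per generated token) is an assumption for illustration, and decoding is usually memory-bandwidth-bound, so real costs would likely be higher than this floor.

```python
# Back-of-envelope cost floor for serving a dense 10T-parameter model.
# All constants are rough assumptions, not measurements.
params = 10e12                     # 10 trillion parameters, dense
flops_per_token = 2 * params       # ~2 FLOPs per parameter per output token

gpu_flops = 1e15 * 0.40            # ~1e15 peak BF16 FLOP/s at 40% utilization
gpu_mem = 80e9                     # 80 GB per accelerator
gpu_cost_hr = 2.0                  # assumed $/GPU-hour

# Weights alone at 1 byte/param (int8/fp8) dictate the minimum serving group.
n_gpus = params * 1 / gpu_mem                                 # ~125 GPUs
group_tokens_per_sec = n_gpus * gpu_flops / flops_per_token   # ~2500 tok/s

cost_per_m_tokens = (n_gpus * gpu_cost_hr) / (group_tokens_per_sec * 3600 / 1e6)
print(f"group size: ~{n_gpus:.0f} GPUs")
print(f"aggregate throughput: ~{group_tokens_per_sec:.0f} tokens/s")
print(f"compute-only cost floor: ~${cost_per_m_tokens:.0f} per million tokens")
```

Under these assumptions the floor lands near $28 per million output tokens, well above typical frontier API pricing, which is the intuition behind resolving NO early on fast, cheaply priced models.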

bought Ṁ50 NO

Tweet seems like BS. Though given the way things are going, making a model that large seems to have almost no benefit, so I doubt it will happen.

twitter claim is total BS! nothing differentiates very-large-parameter models from the ones we have today. like any other model, they are trained by gradient descent to optimize the next-token prediction objective. any dissenting weights will be quickly optimized away, no matter the objective, base or instruct
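As a toy illustration of the "optimized away" point, here is a minimal gradient-descent loop in which a weight initialized far from the objective's minimizer is pulled back toward it. It is a one-parameter stand-in under a made-up objective, not a claim about how full-scale training behaves.

```python
# Toy illustration: under gradient descent, any weight that raises the loss
# ("dissents" from the objective) gets pushed toward a minimizer, regardless
# of what the objective is. Purely illustrative.
import torch

w = torch.tensor([5.0], requires_grad=True)   # a "dissenting" weight
target = torch.tensor([1.0])                  # value the objective prefers
opt = torch.optim.SGD([w], lr=0.3)

for step in range(10):
    loss = ((w - target) ** 2).mean()         # stand-in objective
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"after optimization: w = {w.item():.3f}")  # driven near 1.0
```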
