
Inspired by this tweet: https://x.com/aidan_mclau/status/1859444783850156258
The claim here appears to be that labs have trained very large base models (it is unclear how large) but cannot instruction-tune them. If this is a real phenomenon that cannot be quickly overcome, AI development from here seems likely to be very strange.
This market resolves YES if a model is released before January 1, 2026 that is confirmed to have 10 trillion parameters and follows instructions (i.e., it is not a base model).

Labs are not eager to release parameter counts: it is still not clear how many parameters Claude 3 Opus has, despite its release in February 2024. As a result, this market may not resolve until long after January 1, 2026. However, I will resolve it NO early if I judge that every model released before then is very unlikely to have this many parameters (for example, if they are all very fast or priced similarly to previous models). There is some subjectivity here, so I will not trade on this market.