Will Google Open Source a 7b or larger model based on their Titans Architecture before 2026?
47% chance (closes 2026)

Google recently described a new architecture, Titans, in this paper, which appears to handle long contexts much better than standard transformers: https://arxiv.org/abs/2501.00663

Google also released an architecture called Griffin last year and open-sourced 2B and 9B models based on it, called RecurrentGemma.

Resolves YES if Google open-sources a Titans model with 7B or more parameters before 2026. In the paper, they only experimented with toy models smaller than 1B parameters.
