Will Google Open Source a 7b or larger model based on their Titans Architecture before 2026?
Ṁ2.3k volume · resolved Jan 1
Resolved NO
Google recently described a new architecture, Titans, in this paper, which appears to handle long-context tasks much better than standard transformers: https://arxiv.org/abs/2501.00663
Google also released an architecture called Griffin last year and open-sourced 2B and 9B models based on it, called RecurrentGemma.
Resolves YES if Google open-sources a Titans model with 7B or more parameters before 2026. In the paper, they only experimented with small models under 1B parameters.
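For context on what the market is asking about: the Titans paper's core mechanism is a neural long-term memory that keeps learning at test time, updating its own weights from a "surprise" signal (the gradient of an associative recall loss) with momentum and a forgetting gate. Below is a minimal PyTorch sketch of that update rule, assuming a simple linear memory and fixed illustrative hyperparameters; the paper itself uses a deeper MLP memory and learned, input-dependent gates.

```python
# Toy sketch of the Titans test-time memory update (arXiv:2501.00663).
# Assumptions: a linear memory M(k) = k @ M instead of the paper's MLP,
# and fixed eta/theta/alpha instead of learned, input-dependent gates.
import torch

d = 16
W_K = torch.randn(d, d) / d ** 0.5  # key projection (frozen at test time)
W_V = torch.randn(d, d) / d ** 0.5  # value projection (frozen at test time)
M = torch.zeros(d, d, requires_grad=True)  # the memory being edited online

S = torch.zeros(d, d)               # running "surprise" (momentum term)
eta, theta, alpha = 0.9, 0.1, 0.01  # momentum, step size, forget rate

for x in torch.randn(8, d):          # a toy stream of 8 tokens
    k, v = x @ W_K, x @ W_V
    loss = ((k @ M - v) ** 2).sum()  # associative loss ||M(k_t) - v_t||^2
    (grad,) = torch.autograd.grad(loss, M)
    with torch.no_grad():
        S = eta * S - theta * grad                  # surprise with momentum
        M = ((1 - alpha) * M + S).requires_grad_()  # gated write to memory
```

The forget rate alpha bounds how much old memory survives each write, which is what lets the memory run over very long streams without saturating. The market is about scale: the paper only tests this mechanism below 1B parameters, and the question is whether Google releases an open-weights version at 7B or larger.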
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ1,558 |
| 2 | | Ṁ68 |
| 3 | | Ṁ47 |
| 4 | | Ṁ43 |
| 5 | | Ṁ22 |
Related questions
Will OpenAI announce a new full-size, frontier model >5.2 before March 1, 2026?
78% chance
Will OpenAI announce a new model that EpochAI estimates is at least as large as GPT-4.5, before August 2026?
43% chance
If OpenAI open-sources o3-mini*, will it open-source an even more powerful model before July 2026?
66% chance