Will Google Open Source a 7b or larger model based on their Titans Architecture before 2026?
47% chance
Google recently described a new architecture, Titans, in this paper, which appears to handle long contexts much better than transformers: https://arxiv.org/abs/2501.00663
Google also released an architecture called Griffin last year and open-sourced 2b and 9b models based on it, called RecurrentGemma.
Resolves YES if Google open-sources a 7b or larger Titans model before 2026. In the paper they only experimented with toy models smaller than 1b parameters.
This question is managed and resolved by Manifold.
Related questions
Will Google still actively lead and manage the Bazel build system by the end of 2025?
98% chance
Will Google's Titans architecture replace transformers?
12% chance
Will Google create a new internally-developed smartphone model and sell it widely at some point in the year 2025 or 2026?
74% chance
Will Google release a model that refuses to talk about Tiananmen square, before 2026?
10% chance
Will Google join the voluntary commitment by OpenAI/Anthropic to AISI to share major new models w/AISI prior to release?
83% chance
Will OpenAI release next-generation models with varying capabilities and sizes?
64% chance
Will any Google model exceed ChatGPT interest? (by 2025)
8% chance
Will OpenAI launch a model even more expensive than o1-pro in 2025?
64% chance
Will Google announce the closure of Google Cloud before 2026?
2% chance
If OpenAI open-sources o3-mini*, will it open-source an even more powerful model before July 2026?
50% chance