Will an LLM Built on a State Space Model Architecture Have Been SOTA at any Point before EOY 2027? [READ DESCRIPTION]
58% chance
I don't mean "achieves SOTA on one benchmark" or "is the best FOSS model"; I mean "is the equivalent of what GPT-4 is right now".
The SSM must be in contention for the position of most generally capable LLM. I will not trade in this market because the resolution condition isn't entirely objective.
@HanchiSun I think he means something like Mamba: https://arxiv.org/pdf/2312.00752.pdf
They are vaguely related to RNNs, though.
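For context, here is a minimal sketch of the discretized linear state-space recurrence that Mamba-style SSMs build on, which also shows why they resemble linear RNNs when unrolled sequentially. The fixed, scalar-input parameters A, B, C are simplifying assumptions for illustration only; Mamba itself uses learned, input-dependent (selective) parameters and a parallel scan rather than this Python loop.

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run the discretized recurrence x_t = A x_{t-1} + B u_t, y_t = C x_t."""
    state_dim = A.shape[0]
    x = np.zeros(state_dim)
    ys = []
    for u_t in u:                # sequential form: looks like a simple linear RNN
        x = A @ x + B * u_t      # state update (linear, no nonlinearity)
        ys.append(C @ x)         # readout
    return np.array(ys)

# Toy parameters: a 4-dimensional state driven by a scalar input sequence.
rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)             # stable state-transition matrix
B = rng.normal(size=4)
C = rng.normal(size=4)
u = rng.normal(size=16)         # length-16 input sequence

print(ssm_scan(A, B, C, u))
```

Because the recurrence is linear, the same computation can be expressed as a convolution over the input sequence, which is what lets SSMs train in parallel while still running as a constant-memory recurrence at inference time.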
Related questions
Will Google have the best LLM by EOY 2024?
32% chance
By 2026, will it be standard practice to sandbox SOTA LLMs?
26% chance
Will an LLM be able to solve the Self-Referential Aptitude Test before 2027?
67% chance
Will any LLM released by EOY 2025 be dangerously ASL-3 as defined by Anthropic?
46% chance
Will an LLM/Elicit be able to do proper causal modeling (identifying papers that didn't control for covariates) in 2024?
41% chance
Will any significant interpretability result for a SOTA Transformer architecture be published by 2025?
87% chance
Will a SOTA model be trained with Kolmogorov-Arnold Networks by 2029?
14% chance
Will there be an LLM which can do fluent conlang translations by EOY 2024?
67% chance
Will an open-sourced SOTA LLM be trained on Intel hardware by 2024?
14% chance
Will any foundation models/LLMs be able to reliably come up with novel unparalleled misalignments before EOY 2024?
41% chance