A major ML paper demonstrates a symbolic-enhanced transformer successor outperforming standard transformers by March 2025
21% chance
Will a published machine learning paper demonstrate a new architecture that combines transformers with symbolic methods (category theory, programming language theory, or logic theory) and achieves superior performance on standard benchmarks compared to traditional transformer-only architectures?
Resolution criteria: The paper must be published on arXiv (preferably also at a major ML conference or journal) and show statistically significant improvements over baseline transformers on multiple standard tasks.
This question is managed and resolved by Manifold.
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
72% chance
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
54% chance
Will superposition in transformers be mostly solved by 2026?
73% chance
Will a big transformer LM compose these facts without chain of thought by 2026?
64% chance
If OpenAI makes a transformer sized advancement in the next 5 years, will they publish an accompanying paper?
45% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
70% chance
Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version)
53% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will transformers still be the dominant DL architecture in 2026?
61% chance
Will an AI achieve >85% performance on the FrontierMath benchmark before 2027?
64% chance