
A major ML paper demonstrates a symbolic-enhanced transformer successor outperforming standard transformers by March 2025
Resolved NO · Apr 2 · Ṁ4,106 volume
Will a published machine learning paper demonstrate a new architecture that combines transformers with symbolic methods (category theory, programming language theory, or logic theory) and achieve superior performance on standard benchmarks compared to traditional transformer-only architectures?
Resolution criteria: The paper must be published on arXiv (preferably also at a major ML conference or journal) and show statistically significant improvements over baseline transformers on multiple standard tasks.
This question is managed and resolved by Manifold.
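
The criteria leave the significance test unspecified. The sketch below illustrates one common reading: a paired test of a hybrid (transformer + symbolic) model against a transformer-only baseline across several benchmarks, matched by training seed. All benchmark names, scores, and the alpha threshold here are hypothetical placeholders, not values from any actual paper or from Manifold's resolution process.

```python
# Hypothetical sketch: per-benchmark paired t-tests between a transformer-only
# baseline and a hybrid symbolic-enhanced model. Scores are invented examples.
from scipy import stats

# (baseline scores, hybrid scores), one entry per matched training seed.
benchmarks = {
    "GLUE-avg":  ([82.1, 81.7, 82.4], [83.9, 84.2, 83.6]),
    "GSM8K":     ([57.3, 56.8, 57.9], [61.2, 60.7, 61.8]),
    "BIG-Bench": ([44.5, 45.1, 44.2], [46.9, 47.3, 46.4]),
}

for name, (baseline, hybrid) in benchmarks.items():
    # Paired t-test over matched seeds; one reasonable choice among many
    # (a bootstrap or Wilcoxon signed-rank test would also fit the wording).
    t, p = stats.ttest_rel(hybrid, baseline)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{name}: t={t:.2f}, p={p:.4f} ({verdict} at alpha=0.05)")
```

Under this reading, a paper would need results like these on several benchmarks at once, not a win on a single task, to count toward YES.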
🏅 Top traders
| # | Name | Total profit |
|---|---|---|
| 1 | | Ṁ458 |
| 2 | | Ṁ214 |
| 3 | | Ṁ124 |
| 4 | | Ṁ116 |
| 5 | | Ṁ80 |
Related questions
- Will Transformer-based architectures still be SOTA for language modelling by 2026? (91% chance)
- Will a big transformer LM compose these facts without chain of thought by 2026? (53% chance)
- Will transformers still be the dominant DL architecture in 2026? (81% chance)
- On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarks (84% chance)
- Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version) (43% chance)
- If OpenAI makes a transformer-sized advancement in the next 5 years, will they publish an accompanying paper? (45% chance)
- By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing? (90% chance)
- LLMs by EOY 2025: Will Retentive Learning Surpass Transformers? (Subsidised 400 M$) (10% chance)
- Will superposition in transformers be mostly solved by 2026? (73% chance)
- My probability in 2026 that training transformer LMs will eventually lead to inner misalignment issues (59% chance)