Will the transformer architecture be replaced in SOTA LLMs by 2028?
Resolves 2028 · 63% chance

This market forecasts the likelihood that the transformer architecture, which currently underpins most state-of-the-art large language models (LLMs) such as GPT-3.5, will be replaced by a different architectural paradigm by 2028. Transformers revolutionized natural language processing with their ability to capture long-range dependencies and context. As AI research progresses, however, new architectures and techniques continue to emerge, promising improved efficiency, understanding, and capabilities. This market will consider advancements in AI research, including new model architectures, training techniques, and theoretical insights that might lead to a paradigm shift. This market resolves YES if, by 2028, a new architecture has been adopted in the majority of leading LLMs and is recognized by the AI research community as having supplanted transformers as the dominant architecture for state-of-the-art performance.
