6) An alternative to the transformer architecture will see meaningful adoption.
Resolved YES on Dec 17

All these predictions are taken from Forbes/Rob Toews' "10 AI Predictions For 2024".
The 2023 predictions can be found here, and their resolution here.
You can find all the markets under the tag [2024 Forbes AI predictions].

  • I will resolve to whatever Forbes/Rob Toews say in their resolution article for 2024's predictions.

  • I might bet in this market, as I have no power over the resolution.


Description of this prediction from the article:
Introduced in a seminal 2017 paper out of Google, the transformer architecture is the dominant paradigm in AI technology today. Every major generative AI model and product in existence—ChatGPT, Midjourney, GitHub Copilot and so on—is built using transformers.

But no technology remains dominant forever.

On the edges of the AI research community, a few groups have been hard at work developing novel, next-generation AI architectures that are superior to transformers in different ways.

One key hub of these efforts is Chris Ré’s lab at Stanford. The central theme of Ré and his students’ work has been to build a new model architecture that scales sub-quadratically with sequence length (rather than quadratically, as transformers do). Sub-quadratic scaling would enable AI models that are (1) less computationally intensive and (2) better able to process long sequences compared to transformers. Notable sub-quadratic model architectures out of Ré’s lab in recent years have included S4, Monarch Mixer and Hyena.
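To make the scaling contrast concrete, here is a minimal sketch (illustrative only, not from the article; the sequence length `L` and head dimension `d` are arbitrary assumptions) of why self-attention is quadratic: it materializes an L×L score matrix over all pairs of positions.

```python
import numpy as np

# Illustrative shapes (assumptions, not from the article):
# L = sequence length, d = per-head feature dimension.
L, d = 2048, 64
rng = np.random.default_rng(0)
Q = rng.normal(size=(L, d))  # queries
K = rng.normal(size=(L, d))  # keys
V = rng.normal(size=(L, d))  # values

# Self-attention scores every position against every other position,
# so this matrix has L * L entries: compute and memory grow quadratically.
scores = Q @ K.T / np.sqrt(d)                      # shape (L, L)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
out = weights @ V                                  # shape (L, d)
print(scores.shape, out.shape)                     # (2048, 2048) (2048, 64)
```

Doubling L quadruples the score matrix; a sub-quadratic architecture avoids materializing it at all.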

The most recent sub-quadratic architecture—and perhaps the most promising yet—is Mamba. Published just last month by two Ré protégés, Mamba has inspired tremendous buzz in the AI research community, with some commentators hailing it as “the end of transformers.”
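As a rough intuition for how state space models sidestep that quadratic cost, here is a hedged toy sketch of a plain linear SSM recurrence. Actual Mamba differs substantially (its A, B, C are input-dependent and the scan is parallelized on hardware), so treat every name and shape below as illustrative assumptions.

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Toy linear state space model: h_t = A h_{t-1} + B x_t, y_t = C h_t.
    Each step is a fixed-size state update, so the total cost is O(L) in
    sequence length rather than the O(L^2) of full self-attention."""
    L, _ = x.shape
    h = np.zeros(A.shape[0])            # hidden state; size independent of L
    ys = np.empty((L, C.shape[0]))
    for t in range(L):
        h = A @ h + B @ x[t]            # constant-cost state update
        ys[t] = C @ h                   # readout at step t
    return ys

# Illustrative dimensions: 1,000-step sequence, 8-dim input, 16-dim state.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 8))
A = 0.9 * np.eye(16)                    # stable, fixed transition (toy choice)
B = 0.1 * rng.normal(size=(16, 8))
C = rng.normal(size=(4, 16))
print(ssm_scan(x, A, B, C).shape)       # (1000, 4)
```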

Other efforts to build alternatives to the transformer architecture include liquid neural networks, developed at MIT, and Sakana AI, a new startup led by one of the co-inventors of the transformer.

Next year, we predict that one or more of these challenger architectures will break through and win real adoption, transitioning from a mere research novelty to a credible alternative AI approach used in production.

To be clear, we do not expect transformers to go away in 2024. They are a deeply entrenched technology on which the world’s most important AI systems are based. But we do predict that 2024 will be the year in which cutting-edge alternatives to the transformer become viable options for real-world AI use cases.


🏅 Top traders

 #  Total profit
 1  Ṁ1,272
 2  Ṁ70
 3  Ṁ65
 4  Ṁ49
 5  Ṁ29

Since this seems like one of the resolutions that most upset people, I'm posting the motivation from Toews' Forbes article below:

"The transformer remains the dominant AI architecture today, by far. But 2024 proved to be the year that, to quote last year’s article, 'a challenger architecture broke through and won real adoption, transitioning from a mere research novelty to a credible alternative AI approach used in production.'

That alternative architecture is the state space model (SSM).

Mamba, today’s most prominent state space model, has been downloaded hundreds of thousands of times on Hugging Face since its publication about a year ago. Mamba has inspired a number of variants that are in wide use today, from Vision Mamba to Mixture-of-Experts Mamba to MambaByte. As one example, well-funded Israeli startup AI21 Labs built its flagship model (named Jamba) on the Mamba architecture.

Cartesia, a young startup out of Chris Ré’s Stanford lab focused on productizing and commercializing SSMs, has seen significant growth this year. Its generative audio models—built on the SSM architecture—have emerged as a serious challenger to industry leaders ElevenLabs and OpenAI thanks to their superior efficiency, latency and ability to handle long inputs.

(Other challenger architectures also made progress this year—for instance liquid neural networks—but none have yet achieved the real-world adoption that state space models have.)"

No explanation for the market resolution, and the market creator made the most profit. Hmmm.

EDIT: I see the Forbes post now, the resolution is fair.

@zQ4Z82W I made about Ṁ400 of the profit at resolution today; most of it came from earlier bets predicting that he would count Mamba 💅 :)

I think Mamba has crossed the threshold of "meaningful adoption" at this point

bought Ṁ200 NO

How much is significant?
