Full question:
AGI will not be built with an architecture that is 0 or 1 advances beyond transformers, where an “advance” is a single technology no larger than the move from pre-transformer architectures to transformers (e.g., from pairwise connections to a shared workspace). This scenario is similar to the idea of “prosaic AGI.”
One of the questions from https://jacyanthis.com/big-questions.
Resolves according to my judgement of whether the criteria have been met, taking into account clarifications from @JacyAnthis, who made those predictions. (The goal is that they’d feel comfortable betting at their credence in this market, so I want the resolution criteria to match their intention.)
Stable Diffusion, DreamerV_, etc. are all well beyond transformers
It’s obviously some mix of latent spaces (ascending across many contexts), world models and internal simulation, etc.
The better question is whether it’s way beyond those.
(Transformers are 2017-era tech that happen to be ideal for language; any AGI is going to need interfacing modules and many layers of abstracted representations above each network.)
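For concreteness, here’s a minimal sketch of what “world models and internal simulation” layered above a backbone network could look like, in the spirit of Dreamer-style agents. Every dimension, weight, and function name below is a hypothetical stand-in, not taken from any system mentioned above:

```python
# Hypothetical sketch of a latent world model with internal simulation.
# Illustrative only; all names and dimensions are made up.
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions (hypothetical).
OBS_DIM, LATENT_DIM, ACTION_DIM = 16, 8, 4

# Stand-ins for learned weights: an encoder into a latent space and a
# latent dynamics model -- the two pieces the agent simulates with.
W_enc = rng.normal(size=(LATENT_DIM, OBS_DIM)) * 0.1
W_dyn = rng.normal(size=(LATENT_DIM, LATENT_DIM + ACTION_DIM)) * 0.1

def encode(obs):
    """Map a raw observation into the latent space."""
    return np.tanh(W_enc @ obs)

def dynamics(z, action):
    """Predict the next latent state from the current state and an action."""
    return np.tanh(W_dyn @ np.concatenate([z, action]))

def imagine(obs, policy, horizon=5):
    """'Internal simulation': roll the world model forward without
    touching the real environment, returning the imagined trajectory."""
    z = encode(obs)
    trajectory = [z]
    for _ in range(horizon):
        z = dynamics(z, policy(z))
        trajectory.append(z)
    return trajectory

# Example: imagine 5 steps ahead under a random policy.
random_policy = lambda z: rng.normal(size=ACTION_DIM)
traj = imagine(rng.normal(size=OBS_DIM), random_policy)
print(f"imagined {len(traj) - 1} latent steps")
```

The point of the sketch is the `imagine` loop: planning happens by rolling a learned latent model forward rather than by querying the environment, which is the kind of module layered above a transformer (or other) backbone that the comment is gesturing at.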