
This market is intended to resolve to the bucket containing the LLM generation that Taelin agrees solved his "Modified Perfect Binary Tree" challenge.
A summary of the prompt he used is:
Consider the problem of inverting a perfect binary tree. That's an old, entry-level interview question that humans and LLMs can solve easily. Now, let's add just 3 key nuances to make it new and unique:
1. It must invert keys ("bit-reversal permutation")
2. It must be a dependency-free, pure recursive function
3. It must have the type Bit -> Tree -> Tree
The precise definition of this is here, explained in TypeScript and Agda; a rough, unofficial sketch of the intended behavior follows below.
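For intuition only, here is a minimal TypeScript sketch of what a bit-reversal permutation of a perfect binary tree's leaves looks like, on my reading of point 1. The Tree/Leaf/Node encoding, the helper names, and the flatten-and-rebuild approach are all my own assumptions for illustration; they are not the challenge's precise definitions (those are in Taelin's linked gist), and this sketch deliberately does not satisfy the constraints above, since it is not a single dependency-free, pure recursive function of type Bit -> Tree -> Tree.

```typescript
// Illustrative only: an assumed encoding of a perfect binary tree,
// NOT the challenge's official definitions (those are in Taelin's gist).
type Tree<A> =
  | { tag: "Leaf"; value: A }
  | { tag: "Node"; left: Tree<A>; right: Tree<A> };

const leaf = <A>(value: A): Tree<A> => ({ tag: "Leaf", value });
const node = <A>(left: Tree<A>, right: Tree<A>): Tree<A> => ({ tag: "Node", left, right });

// Collect leaf values from left to right.
function leaves<A>(t: Tree<A>): A[] {
  return t.tag === "Leaf" ? [t.value] : [...leaves(t.left), ...leaves(t.right)];
}

// Rebuild a perfect tree from 2^depth values.
function build<A>(xs: A[]): Tree<A> {
  if (xs.length === 1) return leaf(xs[0]);
  const mid = xs.length / 2;
  return node(build(xs.slice(0, mid)), build(xs.slice(mid)));
}

// Reverse the low `bits` bits of an index.
function reverseBits(i: number, bits: number): number {
  let out = 0;
  for (let b = 0; b < bits; b++) out = (out << 1) | ((i >> b) & 1);
  return out;
}

// Expected behavior (assumed reading of "bit-reversal permutation"):
// the leaf at index i ends up at index reverseBits(i, depth).
function bitReversalSpec<A>(t: Tree<A>): Tree<A> {
  const xs = leaves(t);
  const bits = Math.round(Math.log2(xs.length));
  const out = new Array<A>(xs.length);
  for (let i = 0; i < xs.length; i++) out[reverseBits(i, bits)] = xs[i];
  return build(out);
}

// Depth-2 example: leaves [1, 2, 3, 4] become [1, 3, 2, 4].
console.log(leaves(bitReversalSpec(node(node(leaf(1), leaf(2)), node(leaf(3), leaf(4))))));
```

The hard, open part of the challenge is achieving this behavior as a single pure recursive function with the stated Bit -> Tree -> Tree type, rather than by flattening to an array as this reference sketch does.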
He is even offering 10k USD to the first person to do this.
Taelin has caveated that if a model is no longer based on transformers, his bet no longer stands.
I leave the interpretation of which era contains which models up to myself or the Manifold moderators/admins. Eras should be defined by a combination of performance, release date, and pre-training compute.
As a baseline, and to help with categorization, I'd say the following:
GPT-4 Era:
GPT-4 and its variants (e.g., GPT-4 Turbo, o1)
Claude 3 family (Haiku, Sonnet, Opus)
Gemini 1.5 and potential future iterations
Llama 3 and its variants
GPT-3 Era:
GPT-3 and its variants (e.g., davinci)
GPT-3.5 (including ChatGPT)
Claude 1 and Claude 2
Llama 2
All resolution criteria may be clarified or corrected if it turns out I misstated something. Mods/admins are welcome to voice disapproval of my decision and/or roll it back if they think it is unreasonable.