Is gpt2-chatbot based on the GPT-2 architecture?
Resolved NO (May 14)

Resolves YES if good evidence is provided that gpt2-chatbot is based on the GPT-2 architecture (or is a modification of GPT-2), e.g. a statement from OpenAI.

Resolves NO if no such evidence is provided prior to June 2025.


🏅 Top traders

#  Name  Total profit
1        Ṁ157
2        Ṁ145
3        Ṁ119
4        Ṁ74
5        Ṁ38
bought Ṁ2,000 NO

@jim can this resolve NO?

@DanMan314 we don't have enough information for this to resolve NO

bought Ṁ4,948 NO

@DanMan314 yeah, resolve this no

@Bair also pretty confident this is a no

@RemNi I'm also pretty confident this is a no, but there are no formal reasons to resolve it yet.

@Bair if you're OK with a NO resolution I will resolve NO.

@jim I personally am OK with a NO resolution, but there are many other YES holders, and I think they might not be OK with it.

bought Ṁ50 NO

It makes the same errors with the "davidjl" token, which suggests that it is using the GPT-4 tokenizer. It is also slower than the other models.
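The fingerprinting idea above can be sketched roughly as follows. This is a hypothetical illustration, not how the commenter actually tested it: `ask_model` is a stub standing in for a real chat-API call, and the canned replies are made-up stand-ins for the garbled echoes that glitch tokens tend to produce.

```python
# Hypothetical sketch: probe models with a string that maps to a known
# glitch token ("davidjl") in the GPT-4 tokenizer (cl100k_base). Models
# sharing that tokenizer tend to mangle it in the same way.

GLITCH_PROBE = 'Repeat the string " davidjl" exactly.'

def ask_model(name: str, prompt: str) -> str:
    # Stub standing in for a real API call; the replies here are
    # illustrative, not real model outputs.
    canned = {
        "gpt2-chatbot": "jl",        # garbled echo
        "gpt-4": "jl",               # same failure mode -> same tokenizer?
        "other-model": " davidjl",   # different tokenizer, clean echo
    }
    return canned[name]

def same_glitch(a: str, b: str) -> bool:
    """True if both models mangle the probe in the same way."""
    ra, rb = ask_model(a, GLITCH_PROBE), ask_model(b, GLITCH_PROBE)
    return ra == rb and "davidjl" not in ra

print(same_glitch("gpt2-chatbot", "gpt-4"))        # shared failure mode
print(same_glitch("gpt2-chatbot", "other-model"))  # no shared failure
```

A matching failure mode is weak evidence of a shared tokenizer, not of a shared architecture, which is why it points away from a literal GPT-2 base.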

opened a Ṁ10,000 NO limit order at 50%

@SamForbes limit order up if you believe that!

bought Ṁ30 YES

I'm starting to think it might be, but seems crazy

@RemNi like similar in size to gpt-2, with many new tricks. Possibly also with RAG into a very large dataset of text (maybe containing lots of synthetic examples produced by gpt-4?). And then perhaps some tree of sequence search on top?

@RemNi I saw an opinion that it can't be search because search is incompatible with streaming, but incremental search with limited depth would probably still work with streaming.

sold Ṁ11 YES

@Bair yeah, that's what I thought too when using it on lmsys. I think it's possible that if it's doing search, it might still be outputting tokens to the stream at a constant rate, like an output buffer that lags behind the search lookahead.

@Bair when a token is pushed to the stream, the search process prunes the parts of the tree that don't contain that token, something like that
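The buffered-lookahead idea in the last few comments can be sketched like this. It's a toy illustration, not a claim about what gpt2-chatbot actually does: characters stand in for tokens, a plain list of candidate strings stands in for the search tree, and dropping the last candidate stands in for score-based pruning.

```python
# Hypothetical sketch: depth-limited lookahead search that still streams.
# The stream only emits the common prefix that every surviving candidate
# agrees on, so output lags behind the search frontier instead of
# waiting for the whole search to finish.

from os.path import commonprefix

def stream_with_lookahead(candidates: list[str]):
    """Yield tokens (chars here) as soon as all live candidates agree."""
    emitted = 0
    while True:
        agreed = commonprefix(candidates)
        # Emit newly agreed-on tokens; pruning a branch below can
        # unlock more of the shared prefix on the next pass.
        for ch in agreed[emitted:]:
            yield ch
        emitted = len(agreed)
        if len(candidates) == 1:
            # Search has collapsed to one branch: flush the rest.
            for ch in candidates[0][emitted:]:
                yield ch
            return
        # Stand-in for the search pruning its worst-scoring branch.
        candidates = candidates[:-1]

out = "".join(stream_with_lookahead(["the cat sat", "the cat ran", "the dog ran"]))
print(out)  # -> "the cat sat" once search collapses to the first branch
```

This also matches the pruning observation above: once a token is emitted, any branch not containing it is dead, since the stream can never take it back.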

@Bair ahaha no turns out it's just a big multimodal model

I'm not sure we will know by January. Maybe give it another year?

bought Ṁ100 NO

@Bair extended it to June 2025

@jim also it's obviously not a literal GPT-2, maybe it should say "modification of GPT-2"
