One of a series of questions copied from this Twitter thread: https://twitter.com/Simeon_Cps/status/1578814112028831744.
This resolves positively if GPT-4 has some mechanism by which it can store information that doesn't require retraining it or re-prompting it with whatever context you want it to factor in.
Top traders

# | Name | Total profit
---|---|---
1 | | Ṁ299
2 | | Ṁ39
3 | | Ṁ29
4 | | Ṁ27
5 | | Ṁ23
@StephenMalina I think everybody forgot about this question, and there is no such mechanism.
@LauroLangoscodiLangosco Fine-tuned would be fine. For example, if it's something like WebGPT, where it's trained to retrieve from and deposit to some store, that counts. It does have to be long-term, though.
@StephenMalina Long-term here means permanent, i.e., retrievable an unbounded number of tokens later?
Also, how would this market resolve if GPT-4 itself does not have this kind of memory, but OpenAI releases a model called GPT-4-memory at the same time which is just GPT-4 fine-tuned to use memory?
@StephenMalina curious for your answer to this, since it seems pretty likely to me that there will be some fine-tuned variant of GPT-4 that will have memory, while it's much less likely that the base model will.
@Lauro If there's a fine-tuned variant that has the same number of parameters, is available, and has memory, then I'll resolve yes.
@Lauro to your first question, I don't know about unbounded and I assume the memory won't be infinite. Something like memorizing transformers would count. I also would count if it's not globally permanent but per user or something.
@vluzko Yes, that counts. E.g., something like Memorizing Transformers would count, as would any other way for a model to store arbitrary activations or embeddings and retrieve them.
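For concreteness, here is a minimal sketch of the kind of external memory these comments describe: a per-user store the model could deposit embeddings into and retrieve from an unbounded number of tokens later. Everything here is hypothetical illustration (the `embed` stand-in, the `UserMemory` class, its `deposit`/`retrieve` methods); it is not anything OpenAI has announced for GPT-4.

```python
# Sketch of a per-user long-term memory: deposit (embedding, text) pairs,
# retrieve by similarity later, outside the context window entirely.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Stand-in embedding function. A real system would use the model's own
    encoder; this random projection only demonstrates the plumbing, so the
    ranking below is structural, not actually semantic."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

class UserMemory:
    """Per-user store, matching the 'not globally permanent but per-user'
    caveat above. Deposits never expire, so retrieval works an unbounded
    number of tokens after the write."""

    def __init__(self) -> None:
        self.keys: list[np.ndarray] = []
        self.values: list[str] = []

    def deposit(self, text: str) -> None:
        # Store the embedding as the lookup key, the raw text as the value.
        self.keys.append(embed(text))
        self.values.append(text)

    def retrieve(self, query: str, k: int = 3) -> list[str]:
        # Cosine similarity over unit vectors reduces to a dot product.
        if not self.keys:
            return []
        sims = np.stack(self.keys) @ embed(query)
        top = np.argsort(sims)[::-1][:k]
        return [self.values[i] for i in top]

memory = UserMemory()
memory.deposit("The user's dog is named Rex.")
memory.deposit("The user prefers metric units.")
print(memory.retrieve("What is my dog called?"))
```

The resolution-relevant property is that deposits persist outside any prompt: nothing above requires retraining or re-prompting with the stored context. Memorizing Transformers does something analogous inside the model, attending over a kNN index of past activations rather than an external text store.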