Will GPT-4 have a mechanism for long-term memory (e.g., a scratchpad where it can read and write)?
Resolved NO (Mar 24)

One of a series of questions copied from this Twitter thread: https://twitter.com/Simeon_Cps/status/1578814112028831744.

This resolves positively if GPT-4 has some mechanism by which it can store information, without requiring retraining or re-prompting it with whatever context you want it to factor in.
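For concreteness, here is a toy sketch of the kind of read/write scratchpad the question describes. All names here are hypothetical illustrations, not an actual OpenAI API:

```python
# Toy sketch of a read/write scratchpad for long-term memory.
# Everything here is hypothetical; it does not describe a real GPT-4 API.
class Scratchpad:
    """A persistent key-value store the model could write to in one
    session and read from in a later one, with no retraining and no
    need to re-prompt it with the stored context."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}

    def write(self, key: str, value: str) -> None:
        self._store[key] = value

    def read(self, key: str) -> str | None:
        return self._store.get(key)


# The model would emit something like write("user_goal", "learn Rust")
# in one conversation and read("user_goal") in a later one.
pad = Scratchpad()
pad.write("user_goal", "learn Rust")
print(pad.read("user_goal"))  # -> "learn Rust"
```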


๐Ÿ… Top traders

#   Total profit
1   Ṁ299
2   Ṁ39
3   Ṁ29
4   Ṁ27
5   Ṁ23
bought Ṁ150 of NO

I think everybody forgot about this question and there is no such mechanism. @StephenMalina

@ValeryCherepanov thanks for the bump, resolving!

predicted NO

Do you require that it was trained from scratch to use the memory, or is it enough that it is prompted or fine-tuned, like WebGPT?

sold Ṁ26 of NO

@LauroLangoscodiLangosco Fine-tuned would be fine. For example, if it's something like WebGPT, where it's trained to retrieve from and deposit to an external store, that counts. It does have to be long-term, though.

predicted NO

@StephenMalina Long-term here means permanent, i.e. retrievable an unbounded number of tokens later?

Also, how would this market resolve if GPT-4 itself does not have this kind of memory, but OpenAI releases a model called GPT-4-memory at the same time which is just GPT-4 fine-tuned to use memory?

predicted NO

@StephenMalina curious for your answer to this, since it seems pretty likely to me that there will be some fine-tuned variant of GPT-4 that will have memory, while it's much less likely that the base model will.

@Lauro If there's a fine-tuned variant with the same number of parameters that's available and has memory, then I'll resolve YES.

@Lauro To your first question: I don't know about unbounded, and I assume the memory won't be infinite. Something like Memorizing Transformers would count. I would also count memory that's not globally permanent but per-user or similar.

bought Ṁ10 of NO

My guess is memorization as a discrete, separate entity built on top of GPT-4.

How lossless does the long-term memory need to be? If it can only access some representation of past conversations, does that still count?

@vluzko Yes, that counts. E.g., something like Memorizing Transformers would count, as would any other way for a model to store arbitrary activations or embeddings and retrieve them.
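For illustration, a minimal sketch of the "store embeddings and retrieve them" idea mentioned above, approximating the external-memory flavor of Memorizing Transformers with plain cosine-similarity lookup. This is a hypothetical illustration, not an actual GPT-4 mechanism:

```python
import numpy as np

# Minimal sketch of storing arbitrary embeddings and retrieving them later
# by similarity (in the spirit of Memorizing Transformers). Hypothetical
# illustration only; not a real GPT-4 mechanism.
class VectorMemory:
    def __init__(self, dim: int) -> None:
        self.keys = np.empty((0, dim))  # one stored embedding per row
        self.payloads: list[str] = []   # what each embedding points back to

    def store(self, embedding: np.ndarray, payload: str) -> None:
        self.keys = np.vstack([self.keys, embedding])
        self.payloads.append(payload)

    def retrieve(self, query: np.ndarray, k: int = 3) -> list[str]:
        # Cosine similarity between the query and every stored embedding.
        sims = self.keys @ query / (
            np.linalg.norm(self.keys, axis=1) * np.linalg.norm(query) + 1e-9
        )
        return [self.payloads[i] for i in np.argsort(-sims)[:k]]
```

Note this also covers the per-user case: keep one `VectorMemory` per user and the memory is persistent for that user without being globally shared.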

predicted YES

If they are smart (not always a guarantee with OpenAI), they will give it a ~1M-token memory.

<$2 to read a book and then be able to answer questions about it would be the demo of the half-decade.
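Rough arithmetic behind the "<$2 to read a book" figure; the token ratio and per-token price below are assumptions, not published rates:

```python
# Back-of-the-envelope check on "<$2 to read a book".
# Both numbers below are assumptions, not published OpenAI prices.
words_per_book = 100_000          # a typical full-length book
tokens_per_word = 1.35            # rough average for English text
price_per_1k_tokens = 0.01       # assumed $/1K input tokens

cost = words_per_book * tokens_per_word / 1_000 * price_per_1k_tokens
print(f"~${cost:.2f} to ingest the whole book")  # ~$1.35, under $2
```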

@Gigacasting This would actually be a great idea!

bought Ṁ70 of YES

Best paper of the year