Will GPT-4 have a mechanism for long-term memory (e.g., a scratchpad where it can read and write)?
58%
chance

One of a series of questions copied from this Twitter thread: https://twitter.com/Simeon_Cps/status/1578814112028831744.

This resolves YES if GPT-4 has some mechanism by which it can store information without retraining it or re-prompting it with whatever context you want it to factor in.
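For illustration only, here is a minimal sketch (a hypothetical design, not anything confirmed about GPT-4 or OpenAI's plans) of what such a mechanism could look like: an external scratchpad the model reads from and writes to through a wrapper, so information persists across sessions without retraining or re-prompting.

```python
# Minimal sketch of an external read/write scratchpad (purely illustrative):
# a wrapper persists key/value entries to disk so information survives
# across sessions without retraining or re-prompting with the original context.
import json
from pathlib import Path

class Scratchpad:
    def __init__(self, path: str = "scratchpad.json"):
        self.path = Path(path)
        self.store = json.loads(self.path.read_text()) if self.path.exists() else {}

    def write(self, key: str, value: str) -> None:
        # Persist the entry immediately so it is available in later sessions.
        self.store[key] = value
        self.path.write_text(json.dumps(self.store))

    def read(self, key: str) -> str:
        return self.store.get(key, "")

# Hypothetical usage: a wrapper parses the model's output for commands like
# "WRITE user_birthday: June 3" and replays relevant entries on later turns.
pad = Scratchpad()
pad.write("user_birthday", "June 3")
print(pad.read("user_birthday"))  # retrievable in any later session
```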

Lauro Langosco
is predicting NO at 44%

Do you require that it was trained from scratch to use the memory, or is it enough that it's prompted or fine-tuned, like WebGPT?

Lauro Langosco
sold Ṁ26 of NO
StephenMalina

@LauroLangoscodiLangosco Fine-tuned would be fine. For example, if it's something like WebGPT, where it's trained to retrieve and deposit information there, that counts. It does have to be long-term, though.

Lauro Langosco
is predicting NO at 47%

@StephenMalina Long-term here means permanent, i.e., retrievable an unbounded number of tokens later?

Also, how would this market resolve if GPT-4 itself does not have this kind of memory, but OpenAI releases a model called GPT-4-memory at the same time which is just GPT-4 fine-tuned to use memory?

Lauro Langosco
is predicting NO at 52%

@StephenMalina Curious for your answer to this, since it seems pretty likely to me that there will be some fine-tuned variant of GPT-4 that will have memory, while it's much less likely that the base model will.

StephenMalina

@Lauro If there's a fine-tuned variant with the same number of parameters that's available and has memory, then I'll resolve YES.

StephenMalina

@Lauro To your first question, I don't know about unbounded, and I assume the memory won't be infinite. Something like Memorizing Transformers would count. I'd also count it if it's not globally permanent but per-user or something.

Warlock Tiny
bought Ṁ10 of NO

My guess is memorization as a discrete, separate entity built on top of GPT-4.

vluzko

How lossless does the long-term memory need to be? If it can only access some representation of past conversations, does that still count?

StephenMalina

@vluzko Yes, that counts. E.g., something like Memorizing Transformers would count, as would another way of a model storing arbitrary activations or embeddings and retrieving them.
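For context, a rough sketch of the kind of mechanism being referred to: store embeddings (or attention key/value pairs, as in the Memorizing Transformers paper) in an external memory and retrieve the nearest neighbours for each new query. The class, shapes, and flat-array lookup below are illustrative assumptions, not details from that paper or from OpenAI.

```python
# Sketch of a kNN memory over stored (key, value) vectors. A real system
# would use an approximate-nearest-neighbour index instead of a flat array.
import numpy as np

class KNNMemory:
    def __init__(self, dim: int):
        self.keys = np.empty((0, dim), dtype=np.float32)
        self.values = np.empty((0, dim), dtype=np.float32)

    def add(self, keys: np.ndarray, values: np.ndarray) -> None:
        # Append new (key, value) pairs to the external memory.
        self.keys = np.concatenate([self.keys, keys])
        self.values = np.concatenate([self.values, values])

    def lookup(self, queries: np.ndarray, k: int = 4) -> np.ndarray:
        # For each query, return the values of its k most similar stored keys.
        sims = queries @ self.keys.T                 # (q, n) similarity scores
        topk = np.argsort(-sims, axis=1)[:, :k]      # indices of the top-k keys
        return self.values[topk]                     # (q, k, dim) retrieved values

mem = KNNMemory(dim=8)
mem.add(np.random.randn(100, 8).astype(np.float32),
        np.random.randn(100, 8).astype(np.float32))
retrieved = mem.lookup(np.random.randn(2, 8).astype(np.float32))
print(retrieved.shape)  # (2, 4, 8)
```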

Gigacasting
is predicting YES at 74%

If they are smart (not always a guarantee with OpenAI), they will give it a ~1M-token memory.

<$2 to read a book and then be able to answer questions about it would be the demo of the half-decade.
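A toy illustration of that "read a book, then answer questions about it" workflow, done with retrieval rather than a 1M-token context window: split the book into chunks, score each chunk against the question, and hand only the best chunks to the model. The word-overlap scoring below is a deliberate simplification; a real system would use learned embeddings.

```python
# Toy retrieval over a long document: chunk, score against the question,
# and keep only the top passages to prepend to the prompt.
def chunk(text: str, size: int = 200) -> list[str]:
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(question: str, passage: str) -> int:
    # Count how many passage words also appear in the question.
    q = set(question.lower().split())
    return sum(1 for w in passage.lower().split() if w in q)

def top_passages(book: str, question: str, k: int = 3) -> list[str]:
    passages = chunk(book)
    return sorted(passages, key=lambda p: score(question, p), reverse=True)[:k]

# The retrieved passages would then be prepended to the question in the prompt.
book = "Call me Ishmael. " * 500
print(top_passages(book, "Who is the narrator called?")[0][:60])
```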

SneakySly

@Gigacasting This would actually be a great idea!

Gigacasting
bought Ṁ70 of YES

Best paper of the year