AI: Will GPT-4 have global memory?
resolved Mar 17
Resolved
NO

🏅 Top traders

#  Name  Total profit
1  Ṁ76
2  Ṁ63
3  Ṁ57
4  Ṁ51
5  Ṁ47
predicted NO

Blue check marks—pls re-open

May resolve YES because it quotes most books flawlessly (and their scaling issues indicate something far from a vanilla GPT)

predicted NO

RETVrN (to tradable prediction market)

predicted NO

Copyright lawsuit incoming

predicted NO
predicted NO
predicted NO

May resolve yes

Knows way too many obscure and non-obscure quotes

Quotes at length from random Adam Smith digressions and can recite all of Nietzsche in an exact translation

predicted NO

Re-open this market.

Can you give rationale for resolving this NO?

We don't know if it actually has global memory; in my opinion it probably does, since it is so fast and remembers so much minute detail. Has OAI said anywhere that it doesn't?

I'm pretty sure this should have been resolved N/A

@duck In fact, this in my opinion explains quite well why they didn't even provide a param count.

This should also be viewed with suspicion, since @Gigacasting resolved the market such that he profited from it. He bought NO as recently as 2 days ago and then resolved the market in his favour... This seems very fishy.

@duck They probably didn't provide a param count because GPT-4 was trained using Chinchilla scaling laws, meaning the model can be much smaller while maintaining high performance. And OpenAI is very PR conscious and knows that the public is dumb and will think smaller number --> worse model, so they just didn't share it at all.
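For context on the Chinchilla point: Hoffmann et al. (2022) found that for a fixed training-compute budget, loss is roughly minimized by a much smaller model trained on far more tokens. A back-of-the-envelope sketch using the common approximations C ≈ 6·N·D and D ≈ 20·N (both constants are rough assumptions, not exact values):

```python
# Rough illustration of the Chinchilla argument above: for a fixed training
# compute budget C (FLOPs), the compute-optimal model is much smaller than a
# GPT-3-style allocation. Uses the approximations C ~ 6*N*D and D ~ 20*N
# from Hoffmann et al. (2022); both constants are rough assumptions.
import math

def chinchilla_optimal(compute_flops: float) -> tuple[float, float]:
    """Return (parameters N, training tokens D) that are roughly compute-optimal."""
    n = math.sqrt(compute_flops / (6 * 20))  # solve C = 6 * N * (20 * N) for N
    d = 20 * n                               # ~20 tokens per parameter
    return n, d

if __name__ == "__main__":
    c = 3.1e23  # illustrative: roughly GPT-3's reported training compute
    n, d = chinchilla_optimal(c)
    print(f"params ~ {n / 1e9:.0f}B, tokens ~ {d / 1e9:.0f}B")
```

At roughly GPT-3-scale compute this works out to a model on the order of 50B parameters, which illustrates why a Chinchilla-trained GPT-4 could have a parameter count that sounds unimpressive to the public.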

@jonsimon Also possible, though it's hard to speculate. Still, they didn't rule out it having something akin to RETRO, so it's a possibility. I don't understand why this market was resolved the way it was.

IMO the prediction market would be very useful as an information source even while GPT-4 is live, as similar models that compete with it come out

Not to mention that this reeks of manipulation.

predicted NO

@duck There's nothing in their technical report about it, and it uses the same Chat Completions API, where it wouldn't make sense to have global memory. We already know how ChatGPT (GPT-3.5) does memory for long conversations; it probably uses that instead of some completely new and redundant mechanism.
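To make the long-conversation point concrete: the usual approach with the Chat Completions API is to keep a rolling window of recent messages and drop (or summarize) the oldest turns once a context budget is exceeded, rather than consulting any global memory. A minimal sketch of that truncation; the token heuristic and budget are placeholders, not OpenAI's actual mechanism:

```python
# Minimal sketch of the long-conversation handling the comment alludes to:
# keep the system prompt, drop the oldest turns once the history exceeds a
# token budget. The token heuristic and budget below are placeholders.
def rough_token_count(text: str) -> int:
    # Crude heuristic: roughly 1.3 tokens per whitespace-separated word.
    return int(len(text.split()) * 1.3)

def trim_history(messages: list[dict], budget: int = 4000) -> list[dict]:
    """Drop the oldest non-system messages until the history fits the budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(rough_token_count(m["content"]) for m in system + rest) > budget:
        rest.pop(0)  # discard the oldest user/assistant turn
    return system + rest

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "First question ..."},
    {"role": "assistant", "content": "First answer ..."},
    {"role": "user", "content": "Follow-up question ..."},
]
print(trim_history(history, budget=10))  # drops the two oldest turns
```

Summarizing dropped turns instead of discarding them is a common variant; either way it is conversation-local context management, not global memory.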

@Gigacasting Any explanation? Markets such as https://manifold.markets/andrew/how-many-parameters-will-gpt4-have are staying open, because they are assuming the details might leak. We don't know whether it uses a RETRO-like architecture. And there is ZERO evidence towards it NOT using a RETRO-like architecture.
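For readers unfamiliar with the reference: RETRO (Borgeaud et al., 2022) augments a language model with a large retrieval database, fetching nearest-neighbour text chunks for each input and attending to them during generation. A toy sketch of the retrieval half only, with a placeholder embedding function (RETRO's chunked cross-attention is omitted):

```python
# Toy illustration of the retrieval step in a RETRO-style setup: chunk a
# corpus, embed the chunks, and fetch nearest neighbours for the current
# input. The embedding function is a stand-in, and RETRO's chunked
# cross-attention (which feeds retrieved chunks into the decoder) is omitted.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Placeholder embedding: hashed bag of words, not a real encoder."""
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[hash(word.strip(".,")) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

corpus_chunks = [
    "Adam Smith digresses at length on the price of silver.",
    "Nietzsche, Beyond Good and Evil, aphorism 146.",
    "Chinchilla scaling laws relate parameter count to training tokens.",
]
index = np.stack([embed(chunk) for chunk in corpus_chunks])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = index @ embed(query)              # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]
    return [corpus_chunks[i] for i in top]

print(retrieve("recite Nietzsche in an exact translation"))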

I just subsidized 50 mana

It's very interesting that this is predicted so high. Anyone care to explain some downsides of using a RETRO-like architecture?

What does this mean exactly? Having a memory separate from textual context that persists within a conversation? Between conversations?

@jonsimon Oh, I see the linked paper now. I think the Manifold app is bugging out; I had to open this up in a browser to see the description.
