
LLMs are all the rage, but Apple is notoriously tight-lipped about technologies until they're released as a product. Will they pretend LLMs don't exist?
Resolves YES if LLMs are mentioned in any capacity at the WWDC 2023 keynote (developer sessions and the State of the Union don't count). Otherwise resolves NO.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ774
2 | | Ṁ180
3 | | Ṁ97
4 | | Ṁ87
5 | | Ṁ51
@bjubes "M2 Ultra can train large transformer models". Unclear to me if this should count or not https://youtube.com/clip/UgkxxdVpoYvFfuLNBXF4BQ4D4BoesZnLlM8z
@bjubes I didn't hear LLMs mentioned or find that phrase in the transcript. Obviously biased though.
@clayton "The keyboard now leverages a transformer language model" I think this counts
https://youtube.com/clip/Ugkxt0fR00wkOgKUj6vupyWn5Af5KaAIv2Mw
@bjubes Oh that's the same clip you were talking about. "Transformer Language Model" is the same architecture used by ChatGPT etc., so the only question is whether you consider it "Large", which is kind of subjective IMO
@bjubes depends how literal you want to be with the resolution criteria. I assumed it meant that exact phrase.
@YoavTzfati @WieDan "Large" isn't just an adjective, it's a fundamental distinction. The keyboard transformer language model cannot be large; there isn't close to enough RAM or compute on any handheld to even run such a model, and even GPT-esque models run in datacenters are much slower (compare ChatGPT output speed to autocorrect speed).
The "M2 Ultra can train large transformer models" line is much closer. However, their choice not to mention "large" or "language" in this context is the whole spirit of this market. Apple is shying away from saying the buzzwords and leaving it vague. I think their omission of "LLM," even when it's what everyone is thinking when they mention ML workloads, is purposeful and in keeping with their consistent ML branding over the AI hype.
@bjubes When Google presented PaLM 2 they said that the smallest version (Gecko) could run on a phone. There are increasingly techniques to distill large language models and fine-tune them with very little compute (see, for example, the recent QLoRA).
If I were Apple and wanted to take advantage of the LLM advances, these are the two ways I would do it: put an LLM in the iPhone, and make Macs able to train LLMs.
@YoavTzfati Another similar market resolved the keyboard one as not an LLM-type model https://manifold.markets/Predict/will-apple-announce-their-own-gener?r=Ymp1YmVz
@YoavTzfati And while the Mac Pro can train models, I'm still leaning towards LLMs not being "mentioned," since they explicitly omitted the term in a natural place to bring it up. They never mentioned language at all in the Mac Pro section, despite it being the obvious use case everyone is familiar with.
@bjubes I mean "large transformer models" is pretty obviously referring to language models IMO.
But it's worth mentioning that I won't be offended if this resolves NO; I don't think there's a decisive case in either direction. And we really should have had this discussion in advance, not now.
@YoavTzfati Oh another option is to get someone that didn't bet on the market to decide
@YoavTzfati I asked for a volunteer in Discord but no one offered. I am going to resolve NO, as LLMs were not mentioned directly. While the nod to them was obvious, so was the nod to the metaverse, which was also avoided in similar Apple fashion. That was meant to be the spirit of the market: how Apple talks about tech hype in their marketing events.
But yeah, in the future I'll avoid betting on markets like this that can be subjective; I had thought this one would be pretty clear-cut.
Can you please edit the question to explicitly call out WWDC’s keynote? Considering only the keynote is a very different question to WWDC in general.
I still think it's >25% for the keynote, but not by much. They'll probably announce some ML dev tooling that hypes up Apple Silicon, but they might call out LLMs explicitly only in a dev session.