Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance

Finished training.

@firstuserhere How does this resolve if Inflection doesn't exist at close time? I'm assuming No, since they won't "have a model"?

@FergusArgyll That's right, of course. If the company no longer exists, there's probably no model, which means a resolution to No.

bought Ṁ100 NO

What is this even?!?! 10x GPT-4?!

@FergusArgyll I haven't done a deep dive on Inflection's roadmap, but my general vibes say:

  1. They have a bunch of available compute, name recognition, and investment.

  2. From what I remember of the Mustafa Suleyman 80,000 Hours podcast from several months ago, they are planning giant (10x, 100x?) GPT-4 models soon.

  3. They don't seem to care about responsible scaling as much as other orgs (OpenAI, Anthropic, DeepMind).

  4. YoY, the line goes up on LLM param count, exponentially (rough sketch after this list).

All of these are low-to-mid confidence; I didn't research much before betting, but it's enough for me to be willing to make small bets up to around 45-50%.
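
A rough sketch of point 4's "line goes up" argument, with loudly assumed numbers: GPT-4's parameter count is unconfirmed (~1.8T is a common rumor), and the year-over-year growth multipliers below are guesses, not measured trends.

```python
import math

# How long until frontier models reach 10x GPT-4's size, under
# different assumed parameter-count growth rates? Both inputs are
# assumptions: ~1.8T params for GPT-4 is a rumor, and the YoY
# multipliers are guesses.
GPT4_PARAMS = 1.8e12             # rumored, unconfirmed
TARGET = 10 * GPT4_PARAMS        # what this market asks about

for yoy_growth in (2, 4, 10):    # assumed growth multiplier per year
    years = math.log(TARGET / GPT4_PARAMS) / math.log(yoy_growth)
    print(f"{yoy_growth:>2}x/year -> 10x GPT-4 in ~{years:.1f} years")
# Output: 2x/year ~3.3 years, 4x/year ~1.7 years, 10x/year ~1.0 years
```

With roughly 1.5 years between this comment and the close date, only the fast end of those guessed growth rates gets there in time.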

@FergusArgyll haha, my comment sure aged well.

@RobertCousineau Yeah, now I have to wait a year before this affects my calibration. That's the thing I hate most about this site.

bought Ṁ65 of YES

https://youtu.be/9hscUFWaBvw?t=168

The CEO of Inflection AI announced they would have a 10x and then a 100x larger model within the next 18 months, and they can probably pull it off with their 20k-H100 cluster.

What is unclear here: are we talking about the number of parameters (I assume yes) or the FLOPs used for training (I assume no, but that would be a more meaningful metric)?
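
A hedged back-of-envelope on why the metric matters. Every number here is a rumor or an assumption: GPT-4's ~1.8T parameters and ~2e25 training FLOPs are unconfirmed, and the H100 throughput and utilization figures are rough.

```python
# Can a 20k-H100 cluster train a "10x GPT-4" model? Depends on the metric.
# Assumptions: ~1e15 dense BF16 FLOP/s per H100, 40% utilization (MFU),
# GPT-4 at ~2e25 training FLOPs and ~1.8T params (both rumors).
H100_FLOPS = 1e15
CLUSTER = 20_000 * H100_FLOPS * 0.4      # effective FLOP/s

# Reading 1: "10x" means 10x GPT-4's training compute.
secs = 10 * 2e25 / CLUSTER
print(f"10x the FLOPs: ~{secs / 86400 / 365:.1f} years")     # ~0.8 years

# Reading 2: "10x" means 10x the parameters, trained compute-optimally
# (Chinchilla heuristic: ~20 tokens per param; training FLOPs ~ 6*N*D).
n = 10 * 1.8e12                          # 18T parameters
d = 20 * n                               # tokens
secs = 6 * n * d / CLUSTER
print(f"10x the params: ~{secs / 86400 / 365:.0f} years")    # ~154 years
```

Under these assumptions, 10x the training compute is achievable on that cluster within 18 months, while 10x the parameters trained compute-optimally is not, which is exactly why the choice of metric matters here.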

@NielsW You assume right, and if you'd like to make the market for FLOPs, tag me please :)

bought Ṁ20 of NO

GPT-4 is rumored to have over 1T parameters, and assuming that's true, this market is about a >10T-parameter model. I don't think anyone will bother training a model of that size in the next 1.5 years, because there are so many other promising research directions for making better LLMs: network architectures, dataset construction, training methodologies, etc.
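
For a sense of scale, a sketch with assumed numbers (not sourced figures): mixed-precision Adam training typically carries on the order of 16 bytes of state per parameter, before counting activations.

```python
# Training-state memory for a hypothetical >10T-parameter dense model.
# Assumed footprint: ~16 bytes/param (bf16 weights and gradients, fp32
# master weights, two fp32 Adam moments); 80 GB of HBM per H100.
params = 10e12                    # >10T, per the comment above
state_bytes = params * 16         # assumed bytes of state per parameter
print(f"~{state_bytes / 1e12:.0f} TB of training state")   # ~160 TB
print(f"~{state_bytes / 80e9:,.0f} H100s just to hold it") # ~2,000 GPUs
```

Holding that state is feasible for a well-funded lab; the training compute (see the estimate upthread) is the harder constraint.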