Will GPT-4 be available to ChatGPT Free Users in 2024?
Dec 31
71% chance
@Soli I think a weekend is too little, but if it is a recurrent thing, i.e. multiple weekends, then I would say it counts. If it adds up to 4 weeks or more it should count; anything less would be too little and I would resolve 50%.
Jan 15
@Soli I think it's pretty unlikely, but 4 weeks sounds like more than enough to count; I definitely wouldn't complain about that. You may wish to specify whether the four weeks need to be contiguous, and if so, whether it's the start of the 4 weeks or the end that needs to be in 2024. It should probably be the start, since otherwise the market effectively gets shortened by 4 weeks.
Edit: sounds like you're going for not contiguous. In that case you'd have the odd corner case of GPT-4 becoming permanently free starting Dec 5th, and yet this market resolving NO. You might want to adjust things to avoid that strange outcome.
Jan 15
@chrisjbillington Most paid apps allow users a trial period where they can test the product under the assumption that they will stay. If someone has to enter a credit card to get these 4 weeks and is automatically charged once the trial is over, then this would not count towards resolving this market YES, since in my eyes these are paid users who just got 1 month free of charge.
Jan 15

Free Access Conditions: The bet resolves YES if OpenAI offers GPT-4 (including variants like "GPT-4-lite" or similar) for free to ChatGPT users in 2024 under any of these conditions:

  • Permanent free access with limited usage (e.g., capped messages).

  • Limited trial access totaling at least four weeks (not necessarily consecutive).

  • Access through custom GPTs using GPT-4 on the GPT Store.


Exclusion Criteria

  • Trials requiring credit card information and leading to automatic charges after the trial do not count towards this bet's resolution.

  • Access to GPT-4 through other platforms like Bing does not count.


Edge Cases

  • If a trial that doesn't lead to automatic charges starts in December 2024 and ends in January 2025, it counts towards resolving this market YES.

  • If OpenAI requires a payment method for verification and doesn't apply automatic charges, then this would count toward resolving this question YES. However, this is highly unlikely, since they use phone numbers for verification.



I am confused why this is so low. We are in April and Anthropic is already offering a GPT-4 class model (Sonnet) for free and they are worse-funded. Do the no-voters believe that OpenAI is going to cede the chatbot market? This doesn't make sense from a business standpoint and it is at odds with Altman's recent statements on their mission:

> We don’t run ads on our free version. We don’t monetize it in other ways. We just say it’s part of our mission. We want to put increasingly powerful tools in the hands of people for free and get them to use them. I think that kind of open is really important to our mission. I think if you give people great tools and teach them to use them or don’t even teach them, they’ll figure it out, and let them go build an incredible future for each other with that, that’s a big deal. So if we can keep putting free or low cost or free and low cost powerful AI tools out in the world, I think that’s a huge deal for how we fulfill the mission.

https://lexfridman.com/sam-altman-2-transcript/

I would try to buy this up to 95% if the question were "Will a model at least as good as GPT-4 be available to ChatGPT free users in 2024?". But! I put some probability (10-15%?) that they go straight to GPT-5 and tier it the same way Anthropic did with Claude.

@WillSorenson For me this market is more about naming convention than performance. If OpenAI insists that the X in GPT-X has to map to parameter count, then I am still biased to vote NO, as I find it unlikely that they would be able to cost-effectively serve a near-1T-parameter model to the number of users they currently have.

If they skipped 4 and went straight to 4.5 for free users, this would resolve yes?

@GabeGarboden I think it should still resolve YES, but I would love to hear @chrisjbillington's or @DavidBolin's opinion.

> The bet resolves YES if OpenAI offers GPT-4 (including variants like "GPT-4-lite" or similar) for free to ChatGPT users in 2024

@Soli I wouldn't object. 4.5 (if it happens) being free is a high bar still, and seems close enough.

bought Ṁ200 NO

@Soli Fine with that. Neither 4 nor 4.5 is going to be free.

bought Ṁ30 of YES

Remember that Microsoft is already giving GPT-4 for free to Microsoft Copilot users.

bought Ṁ35 of YES

@esusatyo

> Access to GPT-4 through other platforms like Bing does not count.

bought Ṁ50 of YES

@Soli Yes, I know. I'm invalidating the argument by others in the comments that it's "too expensive".

bought Ṁ100 of YES

@Soli His point is that if Microsoft is *already* doing it, everyone else likely will by the end of the year.

bought Ṁ500 of NO

@KwameOsei I don't think so. Bing has much lower usage than OpenAI; they might be burning money on it, but not enough for it to matter. Not everyone can do that.

bought Ṁ100 of YES

@chrisjbillington I mean, Microsoft is a major investor in OpenAI… technically it's Microsoft's money to burn as well.

@esusatyo It doesn't invalidate it. Microsoft is paying for it, and it just shows how rich they are.

@esusatyo Microsoft is not giving GPT-4 for free to Copilot users.

@pietrokc What do you mean? There is a toggle to turn on GPT-4 and you don't need to pay.

@DylanSlagh I think he means GitHub Copilot, where users have to pay ~$20 per month.

bought Ṁ60 of YES

This market gives a 69% chance that OpenAI will release GPT-5 before 2025.

bought Ṁ100 NO from 55% to 52%
bought Ṁ100 of NO

@Soli They are not going to make GPT-4 free even if they release a GPT-5.

If that happened, plenty of current GPT-4 subscribers would stop paying, and they would not pay for GPT-5 either, because they could just expect to get it for free later.

And GPT-4 is very expensive for them, not something they are just going to start giving away.

They are not going to use that business model, because it isn't one. The price here is absurd.

predicts YES

@DavidBolin Read the description again, please, so that you don't end up complaining later.

predicts NO

@Soli The trial?

That isn't going to happen either. They don't have any developers for that sort of thing.

You can't even search your own conversations or even the titles of the conversations, something that a single developer should be able to implement in 30 minutes or less.

bought Ṁ2 YES
predicts YES

@DavidBolin come on, hahaha. Building search for an app used by more than 100 million people will not take a single developer 30 minutes 😅, but this is beside the point anyway. I just wanted to make sure you read the description, and you did, so thank you. We shall see what happens 😉

I won't refute your points because I still want to increase my Yes position, but once I feel like I have enough, I will get back to this.

bought Ṁ500 NO from 58% to 47%
predicts NO

@Soli You are searching your own chats. At least the titles are in browser memory. Making a search bar for those would indeed take 30 minutes or less (maybe 5 minutes).

bought Ṁ500 of NO

(And even the backend query to search all the text from all your chats would take less than 30 minutes to build and almost certainly would be plenty fast enough for the actual usage it would get.)
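
For concreteness, a minimal sketch of the kind of client-side title filter being argued about here, assuming the chat titles are already held in memory as plain strings (the data shape and names below are made up for illustration, not ChatGPT's actual code):

```typescript
// Hypothetical shape for a chat list already loaded in the browser.
interface ChatSummary {
  id: string;
  title: string;
}

// Case-insensitive substring match over the titles held in memory.
function filterChatsByTitle(chats: ChatSummary[], query: string): ChatSummary[] {
  const q = query.trim().toLowerCase();
  if (q === "") return chats;
  return chats.filter((chat) => chat.title.toLowerCase().includes(q));
}

// Example usage with invented data.
const chats: ChatSummary[] = [
  { id: "1", title: "Trip planning for Japan" },
  { id: "2", title: "Debugging a TypeScript build" },
  { id: "3", title: "GPT-4 pricing questions" },
];

console.log(filterChatsByTitle(chats, "gpt")); // -> the "GPT-4 pricing questions" chat
```

Wiring this to a search box is the easy part of the feature; the disagreement below is about everything around it (full-text search over message bodies, relevance, and scale).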

bought Ṁ50 YES from 51% to 52%
bought Ṁ60 of YES

@Soli Maybe it doesn't take 30 minutes because it also has to be integrated into the UI etc., but it's a pretty easy feature to implement lol

bought Ṁ60 of YES

@DanielWallace It most definitely won't take 30 minutes, and it is certainly doable, but I think the complexity of building search that actually works fast and reliably is being hugely underestimated here. Also, there is a non-trivial cost to building something like that, and it would distract from focusing on other more important features.

bought Ṁ60 of YES

@DavidBolin GPT-4 Turbo is 1/10 the price of regular GPT-4 lol
and GPT-4 Turbo can be used for free via Copilot/Microsoft/Bing

bought Ṁ60 of YES

@Soli I'm a programmer; it is not hard at all, their API responds with JSON.
Searching through hundreds of chats/messages takes almost zero effort.
I also built Discord bots with the API and implemented chat history; adding search to that is beyond simple. I hope you're talking from actual experience
(and I run local models)

bought Ṁ35 of YES

@DanielWallace I built my own search for ChatGPT with a custom Chrome extension, and a search feature for my Arabic AI app that uses Algolia. Still, building search for an app at the scale of ChatGPT is not trivial, especially if we want them to add things such as embeddings that would make the search experience much better. It won't take 30 minutes and has a non-trivial cost, which I am sure you will agree. I have never built an app used by over 100 million active users. All my apps have fewer than 100k users, so I don't feel my experience, or yours, can be generalized here. Generally, I don't particularly appreciate when people oversimplify things and act as if they know better, when there must be valid reasons OpenAI has decided not to add search up to this point. They will probably add it very soon.
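
To make that concrete, here is a toy sketch of what embedding-based search over messages involves, with a crude bag-of-words hash standing in for a real embedding model. The names, shapes, and logic are illustrative assumptions, not OpenAI's or Algolia's actual APIs; the point is only that every message has to be embedded, stored, and ranked, which is where the cost and complexity show up at scale:

```typescript
const DIM = 64;

// Stand-in "embedding": hash each word into one of DIM buckets. A real system
// would call an embedding model and keep the vectors in a dedicated index.
function toyEmbed(text: string): number[] {
  const vec = new Array(DIM).fill(0);
  for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    let h = 0;
    for (let i = 0; i < word.length; i++) h = (h * 31 + word.charCodeAt(i)) >>> 0;
    vec[h % DIM] += 1;
  }
  return vec;
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

interface IndexedMessage {
  chatId: string;
  text: string;
  embedding: number[];
}

// Indexing step: every stored message gets an embedding. At ChatGPT's scale this
// is a real pipeline with storage and compute costs, not an afternoon of work.
function buildIndex(messages: { chatId: string; text: string }[]): IndexedMessage[] {
  return messages.map((m) => ({ ...m, embedding: toyEmbed(m.text) }));
}

// Query step: embed the query and rank messages by similarity.
function search(index: IndexedMessage[], query: string, topK = 3): IndexedMessage[] {
  const q = toyEmbed(query);
  return [...index]
    .sort((a, b) => cosine(b.embedding, q) - cosine(a.embedding, q))
    .slice(0, topK);
}

// Example usage with invented data.
const index = buildIndex([
  { chatId: "1", text: "How do I fine-tune a small language model?" },
  { chatId: "2", text: "Planning a trip to Japan in spring" },
]);
console.log(search(index, "training a model"));
```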

bought Ṁ100 of NO

@Soli The valid reason is that they are not doing much work on the frontend, like I said. Nor will they do the work to offer a free trial.