Free Access Conditions: The bet resolves YES if OpenAI offers GPT-4 (including variants like "GPT-4-lite" or similar) for free to ChatGPT users in 2024 under any of these conditions:
Permanent free access with limited usage (e.g., capped messages).
Limited trial access totaling four weeks (not necessarily consecutive).
Access through custom GPTs using GPT-4 in the GPT Store.
Exclusion Criteria
Trials requiring credit card information and leading to automatic charges after the trial do not count towards this bet's resolution.
Access to GPT-4 through other platforms like Bing does not count.
Edge Cases
If a trial that doesn't lead to automatic charges starts in December 2024 and ends in January 2025, it counts toward resolving this market YES.
If OpenAI requires a payment method for verification but doesn't apply automatic charges, this would also count toward resolving this question YES. However, this is highly unlikely, since they use phone numbers for verification.
I am confused why this is so low. We are in April, and Anthropic is already offering a GPT-4-class model (Sonnet) for free, despite being worse funded. Do the NO voters believe that OpenAI is going to cede the chatbot market? That doesn't make sense from a business standpoint, and it is at odds with Altman's recent statements on their mission:
> We don’t run ads on our free version. We don’t monetize it in other ways. We just say it’s part of our mission. We want to put increasingly powerful tools in the hands of people for free and get them to use them. I think that kind of open is really important to our mission. I think if you give people great tools and teach them to use them or don’t even teach them, they’ll figure it out, and let them go build an incredible future for each other with that, that’s a big deal. So if we can keep putting free or low cost or free and low cost powerful AI tools out in the world, I think that’s a huge deal for how we fulfill the mission.
https://lexfridman.com/sam-altman-2-transcript/
I would try to buy this up to 95% if the question were "Will a model at least as good as GPT-4 be available to ChatGPT free users in 2024?" But! I put some probability (10-15%?) on them going straight to GPT-5 and tiering it the same way Anthropic did with Claude.
@WillSorenson For me this market is more about naming convention than performance. If OpenAI insists that the X in GPT-X map to parameter count, then I am still biased to vote NO, as I find it unlikely that they could cost-effectively serve a near-1T-parameter model to the number of users they currently have.
@GabeGarboden I think it should still resolve YES, but I would love to hear @chrisjbillington or @DavidBolin's opinion.
The bet resolves YES if OpenAI offers GPT-4 (including variants like "GPT-4-lite" or similar) for free to ChatGPT users in 2024
@Soli I wouldn't object. 4.5 (if it happens) being free is a high bar still, and seems close enough.
@Soli Yes I know. I’m invalidating the argument by others in the comment that it’s “too expensive”.
@Soli His point is that if Microsoft is *already* doing it, everyone else likely will by the end of the year.
@KwameOsei I don't think so. Bing has much lower usage than OpenAI; they might be burning money on it, but not enough for it to matter. Not everyone can do that.
@chrisjbillington I mean, Microsoft is a major investor of OpenAI… technically it’s Microsoft’s money to burn as well.
@esusatyo It doesn't invalidate it. Microsoft is paying for it, and it just shows how rich they are.
@Soli They are not going to make GPT-4 free even if they release a GPT-5.
If that happened plenty of current GPT-4 subscribers would stop paying, and would not pay for GPT-5 either because they could just expect to get it free later.
And GPT-4 is very expensive for them, not something they are just going to start giving away.
They are not going to use that business model, because it isn't one. The price here is absurd.
@DavidBolin read the description again please so that you don’t end up complaining later
@Soli The trial?
That isn't going to happen either. They don't have any developers for that sort of thing.
You can't even search your own conversations, or the titles of the conversations, something that a single developer should be able to implement in 30 minutes or less.
@DavidBolin come on, hahaha. Building a search for an app being used by more than 100 million people will not take a single developer 30min 😅, but this is beside the point anyway. I just wanted to make sure you read the description, and you did, so thank you. We shall see what happens 😉
I won't refute your points because I still want to increase my Yes position, but once I feel like I have enough, I will get back to this.
@Soli You are searching your own chats, and at least the titles are already in browser memory. Making a search bar for those would indeed take 30 minutes or less (maybe 5).
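For what it's worth, the trivial version being described here, a client-side substring filter over titles already held in memory, really is a few lines. A minimal sketch (the titles and `searchTitles` are made up for illustration, not ChatGPT's actual code):

```typescript
// Hypothetical data: chat titles assumed to already be in browser memory.
const chatTitles: string[] = [
  "Trip planning for Kyoto",
  "Debugging a Python script",
  "Why is GPT-4 expensive?",
];

// Case-insensitive substring match over titles only -- the "30-minute" version.
function searchTitles(titles: string[], query: string): string[] {
  const q = query.toLowerCase();
  return titles.filter((title) => title.toLowerCase().includes(q));
}

console.log(searchTitles(chatTitles, "gpt")); // ["Why is GPT-4 expensive?"]
```

Fast server-side search over full conversation bodies at ChatGPT's scale is a different problem, which seems to be where the disagreement in this thread lies.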
@Soli maybe it doesn't take 30 minutes because it also has to be integrated into the UI etc., but it's a pretty easy feature to implement lol
@DanielWallace It most definitely won't take 30 minutes. It's certainly doable, but I think the complexity of building search that actually works fast and reliably is being hugely underestimated here. There is also a non-trivial cost to building something like that, and it would distract from focusing on other, more important features.
@DavidBolin GPT-4 Turbo is 1/10 the price of regular GPT-4 lol
and GPT-4 Turbo can be used for free via Copilot/Microsoft/Bing
@Soli I'm a programmer; it is not hard at all. Their API responds with JSON.
Searching through hundreds of chats/messages takes almost zero effort.
I also built Discord bots with the API and implemented chat history; adding search to that is beyond simple. I hope you're talking from actual experience.
(And I run local models.)
@DanielWallace I built my own search for ChatGPT with a custom Chrome extension, and a search feature for my Arabic AI app using Algolia. Still, building search for an app at ChatGPT's scale is not trivial, especially if we want them to add things like embeddings that would make the search experience much better. It won't take 30 minutes and has a non-trivial cost, which I am sure you will agree. I have never built an app with over 100 million active users; all my apps have fewer than 100k users, so I don't feel my experience, or yours, generalizes here. Generally, I don't appreciate it when people oversimplify things and act as if they know better, when there must be valid reasons OpenAI hasn't added search to this point. They will probably add it very soon.
@Soli The valid reason is that they are not doing much work on frontend, like I said. Nor will they do the work to offer a free trial.