Will AI usage in 2030 be split in tiers where the best and most expensive models are used rarely compared to cheaper and lower-quality models?

According to Sam Altman, ChatGPT probably costs an average of "single-digit cents per chat," and it is offered free to consumers today.

At just 3 chats per minute, that works out to roughly $9 per hour (with large error bars), approaching the level of human wages!
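A back-of-the-envelope sketch of that figure. The per-chat cost is an assumption here; $0.05 is just an illustrative midpoint of "single-digit cents":

```python
# Rough hourly inference-cost estimate.
# cost_per_chat is an assumed value ("single-digit cents" per Sam Altman).
cost_per_chat = 0.05     # dollars per chat (assumed)
chats_per_minute = 3     # sustained usage rate from the text

cost_per_hour = cost_per_chat * chats_per_minute * 60
print(f"${cost_per_hour:.2f} per hour")  # → $9.00 per hour
```

At 1 cent per chat the same rate would be $1.80/hour; at 9 cents, $16.20/hour, hence the "large error bars."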

As large language models grow ever larger, that marginal cost may increase further. I think this will be significant for the future of AI.

The key way I see AI doomers predicting incorrectly is failing to account for this exponentially increasing cost when they extrapolate exponentially increasing intelligence.

The combinatorial nature of our world implies exponentially declining returns for more intelligence. Heuristics can only get us so far, and after that, there's just a giant search space. Think of the traveling salesman problem or Monte Carlo tree search – intelligence is NP-hard.
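To make the combinatorial point concrete, a sketch of how fast the traveling salesman search space grows. For the symmetric TSP, fixing the start city and ignoring tour direction gives (n−1)!/2 distinct tours, which outpaces any fixed speedup from better heuristics or hardware:

```python
from math import factorial

def tsp_tours(n_cities: int) -> int:
    """Number of distinct tours in the symmetric TSP: (n - 1)! / 2."""
    return factorial(n_cities - 1) // 2

# The search space explodes long before city counts get large.
for n in (5, 10, 20):
    print(f"{n} cities: {tsp_tours(n):,} tours")
```

Five cities have only 12 tours, ten already have 181,440, and twenty have over 6 × 10^16, which is the sense in which more raw search buys exponentially less.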

While I think AI will be utterly transformational, progress may be slower than expected if running the most productive AIs becomes exceedingly expensive and bottlenecked by hardware.

I imagine a world with different tiers of AI used in different ways, where cheaper tiers of AI will be used more.

This market resolves based on my subjective impression of whether this prediction is correct in 2030.

Mark Henry bought Ṁ55 of YES

Imagine a world where this resolves NO. What would that world look like?

MartinRandall
Is "cheaper" purely measured in terms of execution cost? E.g., if a company takes a model and does additional work to make it faster and cheaper to run, perhaps with custom hardware or regularisation, does that count as a cheaper model or a more expensive model?

James Grugett is predicting YES at 71%

@MartinRandall Yes, let's say it's only about the execution cost of running the model.