
ChatGPT probably costs an average of "single-digits cents per chat" according to Sam Altman and is offered free to consumers today.
For just 3 chats per minute, that works out to roughly $9 per hour (with large error bars), approaching the level of human wages!
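The arithmetic behind that figure can be sketched as follows. The $0.05 per-chat cost is an assumed midpoint of the quoted "single-digit cents" range, not a figure from the source:

```python
# Back-of-envelope: hourly cost of heavy ChatGPT use.
# ASSUMPTION: $0.05/chat, a midpoint of "single-digit cents per chat".
cost_per_chat = 0.05       # dollars (assumed midpoint)
chats_per_minute = 3       # usage rate from the prediction above
hourly_cost = cost_per_chat * chats_per_minute * 60
print(f"${hourly_cost:.2f} per hour")  # prints "$9.00 per hour"
```

Picking 1 cent or 9 cents per chat instead gives $1.80 to $16.20 per hour, which is why the error bars are large.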
As large language models grow ever larger, that marginal cost may rise further. I think this fact will be significant for the future of AI.
The key mistake I see AI doomers making is failing to account for this exponentially increasing cost when they speculate about exponentially increasing intelligence.
The combinatorial nature of our world implies exponentially declining returns for more intelligence. Heuristics can only get us so far, and after that, there's just a giant search space. Think of the traveling salesman problem or Monte Carlo tree search – intelligence is NP-hard.
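To make the combinatorial point concrete, here is a small illustration (my own, not from the source) of how the brute-force search space of the traveling salesman problem grows with the number of cities:

```python
import math

# Number of distinct tours in symmetric TSP with a fixed starting city:
# (n - 1)! / 2. Each added city multiplies the search space, so raw
# intelligence/compute faces factorially diminishing returns.
for n_cities in (5, 10, 15, 20):
    tours = math.factorial(n_cities - 1) // 2
    print(f"{n_cities} cities -> {tours:,} tours")
```

At 20 cities there are already over 10^16 tours; heuristics prune this in practice, but past a point the search space simply dwarfs any fixed budget of compute.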
While I think AI will be utterly transformational, progress may be slower than expected if running the most productive AIs becomes exceedingly expensive and bottlenecked by hardware.
I imagine a world with different tiers of AI used in different ways, where cheaper tiers of AI will be used more.
This market resolves based on my subjective impression of whether this prediction is correct in 2030.
Update: one sufficient condition for resolving NO is if more money is spent on top-tier AI than all the other tiers combined.