
All these predictions are taken from Forbes/Rob Toews' "10 AI Predictions For 2025".
You can find the 2024 predictions here, and their resolution here.
You can find all the markets under the tag [2025 Forbes AI predictions].
Note that I will resolve to whatever Forbes/Rob Toews say in their resolution article for 2025's predictions, even if I or others disagree with his decision.
I might bet in this market, as I have no power over the resolution.
Description of this prediction from the article:
In 2023, the critical physical resource that bottlenecked AI growth was GPU chips. In 2024, it has become power and data centers.
Few storylines have gotten more play in 2024 than AI’s enormous and fast-growing energy needs amid the rush to build more AI data centers. After remaining flat for decades, global power demand from data centers is projected to double between 2023 and 2026 thanks to the AI boom. In the U.S., data centers are projected to consume close to 10% of all power by 2030, up from just 3% in 2022.

The demand for energy to power AI data centers is skyrocketing. Our energy systems are not prepared.
Image source: Semianalysis
Today’s energy system is simply not equipped to handle the tremendous surge in demand coming from artificial intelligence workloads. A historic collision between these two multi-trillion-dollar systems—our energy grid and our computing infrastructure—is looming.
Nuclear power has gained momentum this year as a possible solution to this Gordian knot. Nuclear represents an ideal energy source for AI in many ways: it is zero-carbon, available 24/7 and effectively inexhaustible. But realistically, new nuclear energy sources won’t be able to make a dent in this problem until the 2030s, given long research, project development and regulatory timelines. This goes for traditional nuclear fission power plants, for next-generation “small modular reactors” (SMRs) and certainly for nuclear fusion power plants.
Next year, an unconventional new idea to tackle this challenge will emerge and attract real resources: putting AI data centers in space.
AI data centers in space—at first blush, this sounds like a bad joke about a VC trying to combine too many startup buzzwords. But there may in fact be something here.
The biggest bottleneck to rapidly building more data centers on Earth is accessing the requisite power. A computing cluster in orbit can enjoy free, limitless, zero-carbon power around the clock: the sun is always shining in space.
Of course, plenty of practical challenges remain to be solved. One obvious issue is whether and how large volumes of data can be moved cost-efficiently between orbit and Earth. This is an open question, but it may prove solvable, with promising work underway using lasers and other high-bandwidth optical communications technology.
A buzzy startup out of Y Combinator named Lumen Orbit recently raised $11 million to pursue this exact vision: building a multi-gigawatt network of data centers in space to train AI models.
As Lumen CEO Philip Johnston put it: “Instead of paying $140 million for electricity, you can pay $10 million for a launch and solar.”
Lumen will not be the only organization taking this concept seriously in 2025.
Other startup competitors will emerge. Don’t be surprised to see one or more of the cloud hyperscalers launch exploratory efforts along these lines as well. Amazon already has extensive experience putting assets into orbit via Project Kuiper; Google has a long history of funding moonshot ideas like this; even Microsoft is no stranger to the space economy. Elon Musk’s SpaceX could conceivably make a play here, too.
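To put a rough number on the "sun is always shining" advantage the article leans on, here is a toy comparison of annual energy yield per kW of panels in orbit versus on the ground. Both capacity factors are assumptions for illustration, not figures from the article:

```python
# Rough annual energy yield per kW of panels, orbit vs. ground.
# Both capacity factors are illustrative assumptions: a dawn-dusk
# sun-synchronous orbit is close to always lit, while a good terrestrial
# solar site runs at roughly 25%.
HOURS_PER_YEAR = 8760

orbit_capacity_factor = 0.99    # assumed: panels almost never in eclipse
ground_capacity_factor = 0.25   # assumed: sunny terrestrial site

orbit_kwh = orbit_capacity_factor * HOURS_PER_YEAR    # ~8,670 kWh per kW
ground_kwh = ground_capacity_factor * HOURS_PER_YEAR  # ~2,190 kWh per kW

print(f"Orbit:  ~{orbit_kwh:,.0f} kWh per kW of panels per year")
print(f"Ground: ~{ground_kwh:,.0f} kWh per kW of panels per year")
print(f"Ratio:  ~{orbit_kwh / ground_kwh:.1f}x")
```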
Update 2025-11-12 (PST) (AI summary of creator comment): Partial resolution framework:
If Rob Toews uses language like "Right-ish" or "Wrong-ish", the market will resolve to 75% PROB or 25% PROB respectively
Similar language ("mostly right", "nearly there", "in the right direction", or negative equivalents) will be treated the same way
If the resolution is ambiguous ("hard to say", "some right, some wrong", or other 50% answers), the market will resolve to N/A
Update 2025-11-12 (PST) (AI summary of creator comment): In cases where resolution is ambiguous or not clean cut, the creator may outsource resolution to Manifold moderators rather than resolving themselves.
I don't think it has happened in the last series of predictions (2023 and 2024 versions linked above), but since this market is getting bigger I think it's good if we can agree on expectations for partial resolutions. The below (slightly modified) is what we agreed upon for another market; does anyone have an issue with that?
In 2023 resolutions, he used "Right-ish" to grade some of his predictions. In cases of a similar "Right-ish" (or "Wrong-ish") answer this year, I will resolve to 75% PROB or 25% PROB, respectively. This will apply for similar language too ("mostly right", "nearly there", "in the right direction", and vice versa negatively framed). If he says something like "hard to say" or "some right, some wrong", or anything else that feels like a cop-out or 50% answer, I will just call that N/A.
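To make that concrete, a minimal sketch of how the framework reads as a wording-to-outcome mapping (the phrase list is illustrative, not exhaustive or binding):

```python
# Illustrative only: how the agreed wording-to-resolution mapping could look.
# The phrase list is an example, not an exhaustive or binding rule.
PARTIAL_RESOLUTIONS = {
    "right": 1.0,            # clean YES
    "right-ish": 0.75,       # resolve 75% PROB
    "mostly right": 0.75,
    "nearly there": 0.75,
    "wrong-ish": 0.25,       # resolve 25% PROB
    "mostly wrong": 0.25,
    "wrong": 0.0,            # clean NO
}

def resolve(grading: str):
    """Return a resolution value, or None for ambiguous/cop-out answers (N/A)."""
    return PARTIAL_RESOLUTIONS.get(grading.strip().lower())

print(resolve("Right-ish"))    # 0.75
print(resolve("hard to say"))  # None -> N/A
```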
@HenriThunberg FWIW I'm also happy to outsource resolution to mods in such less than clean cut resolution cases.
@Magnify I also have 10% of my mana in YES shares for this market
But my total mana is 20K, not 303K 😂
It seems to me that we have:
4 pages of people discussing whether AI datacenters in space make sense, and
2 or 3 traders guessing whether Forbes will publicly evaluate their forecast as Wrong, in a topic with quite subjective resolution criteria and several public announcements confirming their forecast
@HenriThunberg I think your reward could be buying at 25% instead of 50+. I know I’d put down no there
@MiguelLM eh Forbes writes themselves as incorrect plenty. I don’t think they would say this is a true prediction at this point unless something sensational happened.
The element of ambiguity via Forbes in my estimate pushes this from a 5% value to maybe 15%.
I think I have roughly 20% of my mana in NO, I’m pretty new my net is 7K
@Magnify I would say Forbes wrote:
several paragraphs of introduction, to give context
three sentences of forecast
a couple of ideas of who could make the forecast become Right
their forecast, in my opinion, is
Lumen will not be the only organization taking this concept seriously in 2025.
Other startup competitors will emerge. Don’t be surprised to see one or more of the cloud hyperscalers launch exploratory efforts along these lines as well.
some ideas of who could make the forecast Right are
Amazon already has extensive experience putting assets into orbit via Project Kuiper; Google has a long history of funding moonshot ideas like this; even Microsoft is no stranger to the space economy. Elon Musk’s SpaceX could conceivably make a play here, too.
What we ended up having in 2025 was:
a second round of investment for Lumen (now called StarCloud), with a successful satellite launch
tomorrow.io launched, covered by Forbes as "An AI Space Company Is Born"
Project Suncatcher launched by Google and announced by its CEO
These facts should meet the threshold for both:
"Other startup competitors will emerge", and
"one or more of the cloud hyperscalers launch exploratory efforts"
I would say we are more in the 60-75% range than in 5-15%.
@Magnify if you search for AI-in-space articles published by Forbes in 2025, all of them seem to be more bullish than bearish.
If we expect some consistency, we should be surprised by a year of confirmatory headlines resulting in a Wrong outcome.
This is the first time I bet on Forbes forecasts, and I haven't read previous years' evaluations in detail, so take my words with a big grain of salt.
@MiguelLM a lot of these don't count as data centres though. Tomorrow.io for example is data collection for Earth-based data centres. Project Suncatcher does seem like it could lead to one, but reading the Google article they have on it, it's more like "we will collect energy in space …. AI!!". That is to say, their actual goal is practical R&D and all the AI mentions are to hype investors, but I can see how Forbes would jump on the hype train like that.
I did some glancing at previous predictions and they seem willing to call themselves wrong even when there’s a bit of ambiguity here and there, but the landscape is changing I will grant.
@Magnify valid points, thanks for pointing them out.
They would help me update my odds to 40-60%
I still don't see the 5-15%. 1 out of 20, 1 out of 6 ... they should come with stronger evidence in my view.
@Magnify
on your assumption
Project suncatcher does seem like it could lead to one, but reading the google article they have on it is more like “we will collect energy in space …. AI!!”
This is how I understand their article:
headline: "Exploring a space-based, scalable AI infrastructure system design"
sub-headline: "Project Suncatcher is a moonshot exploring a new frontier: equipping solar-powered satellite constellations with TPUs and free-space optical links to one day scale machine learning compute in space."
in the article they review several challenges to making it happen, such as how to send data, whether the hardware will hold up, etc.
for sure the energy is a big piece, because the main argument for moving to space is the supposed energy benefits, so it has to be the central part of the evaluation
I'm talking about this blog post: Exploring a space-based, scalable AI infrastructure system design
@HenriThunberg
> @Magnify haha yeah there should be some badge/reward for when the odds are so lopsided.
Call it the Krantz Award.
This prediction seems to mostly be that we will get space datacenters because of energy requirements, but the thing is the problem already has a solution: solar + batteries.
This Dwarkesh interview talks about it.
https://www.youtube.com/watch?v=3cDHx2_QbPE
"Casey Handmer 00:33:33
Let's get concrete here for a second. Let's say you’ve got one rack and it's 1 megawatt. I'll leave the cooling to someone who specializes in air conditioners, but it's basically throwing air conditioners at the problem. Then you have batteries.
So in order to get four nines of uptime on this… In South Texas, you actually need less than this. But let's just say it’s 24 hours worth of battery storage. That means it'll get you through two bad nights in a row, basically. Actually, it turns out that you can significantly decrease power consumption with a very small reduction in overall compute. So if you've got like three really bad days in a row or something, you can dial back your power usage quite a lot without compromising your inference or training.
Okay, so you've got, say, a Tesla Megapack, something like four megawatt hours. So one megawatt rack, and then six Tesla Megapacks, each of which is roughly one truckload worth of stuff. So one truckload worth of rack, and then like six truckloads worth of batteries. Then in order to operate this at an average power of 1 megawatt, your solar arrays in Texas will be something like 25% utilization. So on average, if the sun came up every day and the day was the same length all the time, you would need 4 megawatts of solar arrays, which is about 4 acres of land. But in practice, because you're aiming for four nines instead of one nine, you need an overbuild of about 2.5x. So you've got about 10 acres of solar.
So 10 acres of solar, six truckloads of batteries, one truckload of data center, and some cooling stuff."
So for 99.99% uptime you simply build 2.5x the solar + batteries you need on a nice sunny day and just sell off the extra power. And if the weather is really bad, just get some trucks to ship batteries in from somewhere it's sunny.
Thanks to launches costing $$$$/kg this actually ends up cheaper than space energy, and that's before getting into the massive and comical headaches putting the datacenters into space would cause (maintenance, heating, upgrading compute, launch costs for everything else).
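The arithmetic in that transcript is easy to reproduce. A minimal sketch using the figures Handmer quotes (1 MW rack, ~4 MWh per Megapack, ~25% solar utilization in Texas, ~2.5x overbuild for four nines), treating them as rough assumptions rather than a design:

```python
# Back-of-envelope reproduction of the solar + battery sizing from the
# transcript above. All inputs are the figures Handmer quotes; treat them
# as rough illustrative assumptions.
rack_power_mw = 1.0            # one 1 MW rack
battery_hours = 24             # "24 hours worth of battery storage"
megapack_mwh = 4.0             # one Tesla Megapack is roughly 4 MWh

storage_mwh = rack_power_mw * battery_hours          # 24 MWh
megapacks = storage_mwh / megapack_mwh               # ~6 truckloads of batteries

solar_utilization = 0.25       # average capacity factor quoted for Texas
overbuild = 2.5                # for ~four nines instead of one nine

nameplate_solar_mw = rack_power_mw / solar_utilization   # ~4 MW
overbuilt_solar_mw = nameplate_solar_mw * overbuild      # ~10 MW
acres_per_mw = 1.0             # rough figure implied by "4 MW ~ 4 acres"
solar_acres = overbuilt_solar_mw * acres_per_mw          # ~10 acres

print(f"Batteries: {megapacks:.0f} Megapacks (~{storage_mwh:.0f} MWh)")
print(f"Solar: ~{overbuilt_solar_mw:.0f} MW nameplate (~{solar_acres:.0f} acres)")
```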
---
https://research.google/blog/exploring-a-space-based-scalable-ai-infrastructure-system-design/
"Historically, high launch costs have been a primary barrier to large-scale space-based systems. However, our analysis of historical and projected launch pricing data suggests that with a sustained learning rate, prices may fall to less than $200/kg by the mid-2030s. At that price point, the cost of launching and operating a space-based data center could become roughly comparable to the reported energy costs of an equivalent terrestrial data center on a per-kilowatt/year basis[1]. See the preprint paper for more details."
Google estimates that it will take until the mid-2030s, with launch prices falling by over an order of magnitude, before energy costs equalize (and presumably that makes the "nothing ever happens" assumption that solar prices stop falling tomorrow for no reason).
So in a decade? Sure, we might get them, but space AI datacenters are nowhere near economically viable now or in the near future, even if you ignore all the other issues.
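For a sense of what "less than $200/kg by the mid-2030s" implies, here is a toy exponential decline. The starting price and annual decline rate are assumptions picked for illustration, not figures from the Google paper:

```python
# Toy launch-price decline: NOT the Google analysis, just an illustration of
# how long a sustained exponential decline takes to reach $200/kg.
# Starting price and decline rate are assumptions.
price_per_kg = 1500.0      # assumed 2025 price, $/kg to orbit
annual_decline = 0.18      # assumed: 18% cheaper each year
target = 200.0

year = 2025
while price_per_kg > target:
    price_per_kg *= (1 - annual_decline)
    year += 1

print(f"Crosses ${target:.0f}/kg around {year} "
      f"(final modelled price ${price_per_kg:.0f}/kg)")   # roughly mid-2030s
```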
@lemon10 Casey Handmer has no clue what he's talking about half the time. Even in that video he starts out by saying that high speed trains are poor capital allocation and China has no mountains separating it from surrounding great powers.
@lemon10 also the world isn't just the USA. Japan is going to struggle to roll out massive solar-powered datacenters domestically. They have the capital and the incentive to push harder for something like AI datacenters in space.
Technically everything is “in space” so all future worldly datacenter projects are also in space
Ok, to spell this out: mirrors can be put in space to redirect light for solar power on Earth, and satellites can generate power for themselves from solar and onboard sources. However, unlike a small computer on a spacecraft, a datacenter cannot be put in space profitably, because it is functionally impossible to deal with or redirect the heat it has to generate at the scale the idea requires: you don't have convection cooling because you don't have an atmosphere, you only have radiative cooling. And obviously the massive upfront cost makes it make even less sense versus somewhere with available infrastructure (for example the US), somewhere where electricity is nearly free (for example Iceland), or just building a power station into the design, which some groups are not doing. Those are not the only problems with putting a great big datacenter in space. Remember that there are actually sensible plans to put manufacturing in space, something that not only can work but is inevitable in time. Despite all the SpaceX hype, they can barely launch successfully, cannot launch any payload that weighs anything, are enormously behind half-century-old tech in terms of practicality, cost the taxpayer dozens of times more than it took to land on the Moon the first time, and have a history of making rookie mistakes and firing the people who pointed out issues, only for those issues to cause catastrophes. Even countries and organisations that actually do sensible space work have been struggling with problems that were previously solved. This wouldn't even have been an argument earlier this century, but right now it's a money-spinner to pretend, like "let's go raise the wreck of the Titanic" or "let's launch a datacenter into space", and that's not the same as a serious idea.
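To put a number on the radiative-cooling point: with no atmosphere, a radiator can only reject heat at roughly εσT⁴ per square metre. A minimal sketch for a 1 MW rack, assuming an emissivity of 0.9 and a radiator running near 300 K (both assumptions):

```python
# Stefan-Boltzmann sizing of a space radiator for a 1 MW heat load.
# Emissivity, radiator temperature, and the neglect of absorbed sunlight /
# Earth infrared are simplifying assumptions for illustration.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)

heat_load_w = 1_000_000   # 1 MW rack; essentially all of it becomes heat
emissivity = 0.9          # assumed radiator emissivity
radiator_temp_k = 300.0   # assumed radiator temperature (~27 C)

flux_w_per_m2 = emissivity * SIGMA * radiator_temp_k ** 4   # ~410 W/m^2
radiator_area_m2 = heat_load_w / flux_w_per_m2
# (a radiator emitting from both faces would need roughly half this area)

print(f"Rejected flux: ~{flux_w_per_m2:.0f} W/m^2")
print(f"Radiator area for 1 MW: ~{radiator_area_m2:,.0f} m^2")
```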
@AlanTennant you’re answering the question “is it rational,” not “will stupid investors try it anyway” or “will media frame it this way for clicks”
@Dulaman ehhhh I think this is talking about Starlink with regard to the statistics. All mentions of AI here, small as they are, are completely speculative.
@Dulaman yeah because managing orbit trajectories for 10000+ satellites is so easy.
It's also worth noting that a single AI data centre typically only uses a couple hundred MW. The entire analysis you posted is based on a single line Elon Musk tweeted, where he says 100 GW of computing per year.
This likely refers to the energy required to launch the satellites rather than to power the compute. It's not feasible to produce 100 GW unless you did nuclear in space, but then you would have a space station, not multiple satellite launches like Musk implies.
See below for the exact words of the article:
The 100 GW figure comes directly from Elon Musk, who stated on X in November 2025 that SpaceX aims to deploy 100GW of high-Earth-orbit energy per year within roughly five years. That single line became the catalyst for this analysis
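As a sanity check on that 100 GW figure, here is a rough estimate of the solar array area it would imply in orbit, using the solar constant (~1361 W/m²) and an assumed panel efficiency; eclipse time and other losses are ignored:

```python
# Rough array area needed to generate 100 GW continuously in orbit.
# Panel efficiency is an assumption; eclipse time, degradation and pointing
# losses are ignored for simplicity.
SOLAR_CONSTANT = 1361.0     # W/m^2 above the atmosphere

target_power_w = 100e9      # 100 GW, the figure from Musk's post
panel_efficiency = 0.25     # assumed

area_m2 = target_power_w / (SOLAR_CONSTANT * panel_efficiency)
print(f"~{area_m2 / 1e6:.0f} km^2 of panels")   # roughly 290 km^2
```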
