5. Depreciation schedules become critically important, as debt plays a growing role in the AI buildout [See full title!]
  • All these predictions are taken from Forbes/Rob Toews' "10 AI Predictions For 2026".

  • You can find the 2025 predictions here, and their resolution here.

  • You can find all the markets under the tag [2026 Forbes AI predictions].

  • Note that I will resolve to whatever Forbes/Rob Toews say in their resolution article for 2026's predictions, even if I or others disagree with his decision.

  • I might bet in this market, as I have no power over the resolution.

    Full title: A mundane and esoteric accounting concept — depreciation schedules — will become critically important, especially as debt plays a growing role in the AI infrastructure buildout.

    Description of this prediction from the article:

    AI is an exciting and futuristic space. Accounting is not. Yet a seemingly boring and obscure accounting concept will become critically important for the field of AI in 2026. Get ready to start hearing a lot about depreciation schedules for AI chips.

    Why does this matter?

    Let’s zoom out. When a company acquires any long-lived asset (like a chip), it doesn’t treat the full cost of the asset as an expense upfront. Instead, it estimates the useful life of the asset and then spreads the asset’s cost out across that period of time. This is known as depreciation. So, for instance, if a company buys a piece of machinery for $10 million, and believes that the machine will productively function for 10 years, the company will depreciate its cost over that 10-year period, meaning that it recognizes $1 million in cost per year for 10 years.
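
    To make the arithmetic concrete, here is a minimal sketch of the straight-line depreciation described above, in Python; the function name and structure are just illustrative:

```python
def straight_line_depreciation(cost: float, useful_life_years: int) -> float:
    """Annual expense when an asset's cost is spread evenly over its useful life."""
    return cost / useful_life_years

# The example above: a $10 million machine with an estimated 10-year useful life.
annual_expense = straight_line_depreciation(10_000_000, 10)
print(f"${annual_expense:,.0f} recognized per year")  # $1,000,000 recognized per year
```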

    We are in the midst of the largest capital investment in the history of humanity. In 2025 alone, the hyperscalers will invest $325 billion to build out AI data centers and infrastructure. By 2030, McKinsey estimates that a total of $6.7 trillion will be spent globally on AI infrastructure.

    To put these figures in perspective, the total amount spent on the Manhattan Project was $30 billion (inflation-adjusted to 2025 dollars). The total spent on the entire Apollo Program — which spanned more than a decade at the height of the Space Race with the Soviet Union — was $300 billion (again, inflation-adjusted). No previous human undertaking in history matches the scale and scope of today’s AI infrastructure buildout.

    And the single biggest source of spend within this AI infrastructure buildout — about half of the total — is on AI chips, like Nvidia’s GPUs and Google’s TPUs.

    Historically, it has been common to depreciate chips, servers and other computing resources over a five-year period. In general, this has served as a reasonable assumption about how long chips last before they need to be replaced.

    But in today’s AI era, things move faster than they ever have before.

    Nvidia now releases a major new GPU model every year or so. Many customers demand the most cutting-edge AI hardware in order to stay competitive. So AI chips become obsolete faster than they ever have before (even when they still technically work).

    In 2023, Nvidia H100s were brand-new, top-of-the-line GPUs that everyone wanted to get their hands on. By 2024, Nvidia H200s had rolled out and were the most sought-after AI chip. This year saw the arrival of Nvidia’s Blackwell line of chips, with the B200 assuming the mantle as the most in-demand AI hardware. The B300 (Blackwell Ultra) is just about to start rolling out to customers. Don’t get too used to it, though: 2026 will see the arrival of Nvidia’s new Rubin architecture, representing the next full generational leap for AI GPUs.

    This brings us to the crux of this issue: should companies that own AI chips — cloud providers like Amazon and Microsoft, AI labs like OpenAI and Anthropic, data center companies like Equinix and Digital Realty, neoclouds like CoreWeave and Nebius, among others — use a meaningfully shorter depreciation schedule (say, one or two years) than they historically have when accounting for their chip investments?

    A simple example will help illustrate why this question matters so much. Say a cloud company spends $50 billion on AI chips. If the company uses a five-year linear depreciation schedule for the chips, that translates to a $10 billion annual depreciation expense. If the company uses a two-year linear depreciation schedule, that’s a $25 billion annual expense. In other words, it’s a $15 billion swing in annual profitability based purely on accounting, with no changes to the company’s underlying cash flows or operations.
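
    Here is that swing as a quick sketch, using the article’s figures and assuming straight-line schedules:

```python
capex = 50e9  # $50B spent on AI chips

expense_5yr = capex / 5  # $10B of depreciation expense per year
expense_2yr = capex / 2  # $25B of depreciation expense per year

swing = expense_2yr - expense_5yr
print(f"Swing in reported annual profit: ${swing / 1e9:.0f}B")  # $15B
```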

    This debate will have significant real-world implications. It will influence the broader discourse about the profitability of AI and the long-term financial sustainability of the entire field. Longer depreciation schedules will support narratives that AI margins are improving and that AI-based businesses can be highly profitable. Shorter depreciation schedules, on the other hand, will strengthen the perception that AI, while it may be a powerful technology, is so capital-hungry that its economic returns and viability remain unclear.

    If companies choose long depreciation schedules (say, five years) and then reality moves much faster (say, demand for those chips falls off a cliff after two years), this can lead to massive impairment charges that can abruptly transform a company’s or even sector’s financial health. An “impairment bomb” scenario like this is exactly what played out with the fiber overbuild in the early days of the internet.
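
    A hypothetical sketch of how such an impairment charge would land on the books; the two-year demand cliff and the $5 billion recoverable value are illustrative assumptions, not figures from the article:

```python
cost = 50e9        # original chip investment
life_years = 5     # chosen depreciation schedule
years_elapsed = 2  # demand falls off a cliff here (assumption)
recoverable = 5e9  # what the chips could still fetch (assumption)

book_value = cost * (1 - years_elapsed / life_years)  # $30B still on the books
impairment = max(0.0, book_value - recoverable)       # recognized all at once
print(f"One-time impairment charge: ${impairment / 1e9:.0f}B")  # $25B
```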

    The potential pitfalls are further amplified when debt levels ramp up, which is exactly what has happened over the past year in the world of AI infrastructure.

    Treating AI chips as long-lived assets with long depreciation schedules can encourage the introduction of greater leverage into the system — possibly too much leverage.

    Lenders assess a company’s financial profile when deciding whether to lend money and on what terms. Longer depreciation schedules make earnings and coverage ratios look better, which can make lenders comfortable with higher levels of debt.
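
    To see the mechanism, here is a toy comparison of interest coverage (EBIT divided by interest expense) under the two schedules; every dollar figure below is a made-up assumption:

```python
revenue = 20e9          # assumed annual revenue
operating_costs = 5e9   # assumed cash operating costs
interest_expense = 3e9  # assumed annual interest on the debt
capex = 50e9            # chip investment being depreciated

for life_years in (5, 2):
    ebit = revenue - operating_costs - capex / life_years
    coverage = ebit / interest_expense
    print(f"{life_years}-year schedule: EBIT ${ebit / 1e9:+.0f}B, coverage {coverage:+.1f}x")
```

    On these toy numbers, the five-year schedule shows comfortable coverage while the two-year schedule shows an operating loss, which is exactly the difference a lender cares about.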

    Loans for AI capital investment are often secured by the chips themselves. It is easy to imagine how things might go wrong if large volumes of long-term loans (say, 10-15-year debt instruments) are secured by assets that in reality become obsolete in 18-36 months. A mismatch between long-lived debt and short-lived economics has been the cause of many a financial crisis.
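
    A toy illustration of that mismatch: a 10-year loan amortizing in a straight line against chips whose economic value decays to zero over three years. Every number here is an assumption chosen for illustration:

```python
loan = 35e9         # borrowed against the chips (assumption)
term_years = 10     # long-lived debt instrument (assumption)
chip_value = 50e9   # initial collateral value (assumption)
obsolete_after = 3  # years until the chips are economically worthless (assumption)

for year in range(1, term_years + 1):
    balance = loan * (1 - year / term_years)  # straight-line amortization
    collateral = max(0.0, chip_value * (1 - year / obsolete_after))
    status = "ok" if collateral >= balance else "under-collateralized"
    print(f"year {year:2d}: balance ${balance / 1e9:4.1f}B, collateral ${collateral / 1e9:4.1f}B  {status}")
```

    On these assumptions the collateral stops covering the outstanding balance after the first year, with most of the loan term still ahead.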

    Heading into 2026, as concerns continue to mount about a potential AI bubble and an infrastructure overbuild, these otherwise arcane accounting details will suddenly become keenly interesting to many people.

    If there is one company around which this debate will focus most acutely next year, it is CoreWeave (NASDAQ: CRWV). Today, CoreWeave uses long depreciation schedules for its GPUs (up to six years). Far more than for the hyperscalers, GPUs are the dominant asset on CoreWeave’s balance sheet and the dominant collateral behind its debt. And CoreWeave has taken on a lot of debt. It will be fascinating to watch this story play out in the headlines, and in CoreWeave’s stock price, over the course of 2026.
