Here's an ASK HN thread I posted on whether Moore's Law is dead or not:
It seems to descend a bit into a semantic argument about what "Moore's Law" means. If you go by transistor count, the consensus in that particular thread is that we're still meeting the technical prediction every two years. But if you go by cost and power, those seem to have come unglued over the last decade, so what "Moore's Law" meant to the average computer enthusiast in 1999 no longer seems to be operating and is thus "dead."
So for the purposes of THIS market I define "The Spirit of Moore's Law" as being:
About every two years, computers get about twice as powerful for the same price
About every two years, computers get about twice as cheap for the same power
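To make the compounding concrete, here's a minimal sketch of what that definition implies for performance per dollar over time. The two-year doubling period is the assumption under test, and the function name is just illustrative:

```python
def perf_per_dollar(years: float, doubling_period: float = 2.0) -> float:
    """Relative performance per dollar after `years`, assuming it
    doubles every `doubling_period` years (the "Spirit of Moore's Law")."""
    return 2.0 ** (years / doubling_period)

# Under the two-year doubling assumption:
print(perf_per_dollar(10))  # 32.0  -> a dollar buys 32x the compute after a decade
print(perf_per_dollar(20))  # 1024.0 -> roughly three orders of magnitude in 20 years
```

That ~32x per decade is the "free lunch" in question: if the doubling period has stretched to, say, four years, the same decade only delivers about 5.7x.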
This is important because it means you can just rely on hardware advancement to make your old code faster.
Also, whether the "Spirit" of Moore's Law is dead or not has important implications for AI timelines, even if the literal transistor count part is still on track.
Essentially this market is looking for a definitive answer to whether the ongoing free lunch of cheap computational advancement I experienced in my childhood and young adulthood has substantially slowed down.
On January 1, 2024, I'll review the evidence. If I'm convinced the party is over, this resolves YES. If I'm not convinced or I'm only kinda convinced but there's a strong lingering reasonable doubt, this resolves NO.
I won't bet in this market, but I will subsidize it.
To be clear, the way to profit on this market is: bet in the direction you think is true, and then go gather as much evidence as you can to convince me that's correct, and post it here.
One could argue that Moore's law is still true if we only look at transistor count, thanks to GPUs, but you seem to have something stronger in mind. You say that the spirit of Moore's law would imply that "you can just rely on hardware advancement to make your old code faster," but this would not be true for most programs running on CPUs, as they would be slower if we ran them on GPUs. I do not think it makes sense to generalize Moore's law to GPUs: they are completely different architectures, have different performance characteristics, and should be evaluated on their own terms, not compared to CPUs.
@Pazzaz the total computation power on the planet Earth doubles every...
Here is a paper from 2020: "In the twilight of Moore's law, GPUs and other specialized hardware accelerators have dramatically sped up neural network training." IN THE TWILIGHT OF MOORE'S LAW! That means it's ending. Peer reviewed paper.
Quite obviously happening.