Going to explain my vote a bit. If we consider 95% of humans to have general intelligence, it's difficult for me to say that OpenAI's o3 high-compute model doesn't have it. I'm confident it would outperform the median human on non-specialized tasks. That said, I'm also of the opinion that most people aren't great thinkers, and that general intelligence isn't as useful as we may historically have assumed. It turns out a lot of skills require specialized intelligence.
As an example, the market's "expected AGI" date is 2028 (https://manifold.markets/ManifoldAI/agi-when-resolves-to-the-year-in-wh-d5c5ad8e4708), yet there's less than 50% confidence (as of writing) that by 2028 an AI, even a specialized one, could make a high-quality movie (https://manifold.markets/ScottAlexander/in-2028-will-an-ai-be-able-to-gener).
I get the feeling we've lost the distinction between AGI and superintelligent AI.