https://garymarcus.substack.com/p/agi-will-not-happen-in-your-lifetime
Resolves very subjectively; don't bet if you don't trust my judgement.
Physical reasoning includes things like driving cars, cleaning houses, and other use-cases that involve understanding directions and distances in the real world and manipulating physical objects.
Psychological reasoning includes things like understanding human thoughts and ideas, making art and mathematical discoveries, writing computer programs, etc.
This market resolves once it seems that AI's mastery of one of those domains has radically improved over the status quo as of January 2023. For example, if before we reach AGI we get self-driving cars that are radically safer than human drivers in all situations, this will probably resolve YES. If instead AI first comes up with some deep insight that solves a mathematical problem humans have failed to solve for many years, this will probably resolve NO.
(If they're both solved at the same time by the same system that has reached AGI, this resolves to 50% or N/A, whichever traders think makes more sense.)
I feel as though your examples should be reversed.
Much of what (in my opinion) makes self-driving a difficult problem is understanding the human psychology of other drivers. The physical challenges, by contrast, I would consider better-defined computationally, and I would expect them to be overcome earlier.
How is mathematics "psychological reasoning"? Mathematics is not inherently psychological, is it?