Gary Marcus prediction: physical reasoning of AI systems will improve before psychological reasoning

Resolves very subjectively; don't bet if you don't trust my judgement.

Physical reasoning includes things like driving cars, cleaning houses, and other use cases that involve understanding directions and distances in the real world and manipulating physical objects.

Psychological reasoning includes things like understanding human thoughts and ideas, making art and mathematical discoveries, writing computer programs, etc.

This market resolves once it seems that AI's mastery of one of those domains has radically improved over the status quo as of January 2023. For example, if before we reach AGI we get self-driving cars that are radically safer than human drivers in all situations, this will probably resolve YES. If AI comes up with some deep insight that solves a mathematical problem humans have failed to solve for many years, this will probably resolve NO.

(If they're both solved at the same time by the same system that has reached AGI, this resolves to 50% or N/A, whichever traders think makes more sense.)


I feel as though your examples should be reversed.

Much of what (in my opinion) makes self-driving a difficult problem is understanding the human psychology of other drivers, whereas the physical challenges I would consider to be more well-defined for computation, and I would expect them to be overcome earlier.

How is mathematics "psychological reasoning"? Mathematics is not inherently psychological, is it?

@cloudprism I would absolutely bet YES if I understood the resolution criteria

Why is this market SOOOOOO HIGH?

The most impressive models to date are LLMs, which understand the world through words. Words were designed for, and are best at, expressing thoughts and feelings. Of course that's where these models will excel.
