What is the best definition of AGI?
resolved May 3
An artificial system that can follow and apply any set of instructions an average human can follow and apply
An artificial intelligence which outperforms humans on all cognitive tasks
An artificial intelligence that outperforms an average human at most tasks
Highly autonomous systems that outperform humans at most economically valuable work
The ability of AI systems to perform all tasks that humans can perform
AI as capable as at least one human who has "general intelligence"
Other

This is a preliminary poll for /singer/what-is-the-best-definition-of-agi

It won't be used in any way for the resolution, but it might give traders an idea for what definition will win in the final poll at the end of the year.


Anything with "Average human" in it fails for me.

Generality is an emergent capability, based on the ability to 'abstract' from the patterns in the data that trained other capabilities.

Higher generality would mean a greater propensity for emergent capabilities relative to trained capabilities (greater abstraction, more generalised models/tools).

You could have a machine with higher generality, i.e. a higher propensity to generalize, that is still less capable than a human being. It would be MORE general but less capable.

Generalisation is, in a sense, already achieved to some degree, and I think there is no clean threshold to be gotten from talking about AGI.

ASI, on the other hand, is a threshold that can be assessed by comparing an AI's capabilities to those of humans, but it is only reached when the system exceeds all humans.


Generality per unit of scale appears to be lower for silicon than for biological systems. Hence, by the time a silicon system's generality (emergent capabilities relative to trained capabilities) reaches parity with an average human, it would already be superhuman.
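One way to pin down the ratio being described here (my notation, not the commenter's): define

G = \frac{C_{\text{emergent}}}{C_{\text{trained}}}

where C_emergent counts capabilities the system was never directly trained for and C_trained counts those it was. On this reading a system can score high on G while both counts stay small, which is exactly the "more general but less capable" case above.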


AGI in a sense may already be achieved, just in a limited domain.

I wonder if you need to add a 'perceived average human' option too.

I think people (especially those who think about AGI) assume humans are far more competent/intelligent on average than they actually are...

I prefer the economic framing: AGI is reached when the algorithm can autonomously be responsible for X% of worldwide GDP.

  • Phrasing like "mental" or "most tasks" is not practically measurable.

  • Using humans as a baseline is mistaken, since it presumes the generality of humans. In reality, humans are pretty lacking at general skills like calculating and memorizing, and really good at niche skills like facial recognition and finger dexterity.

Also, certain tasks like "being human", which I predict we will start valuing more and more without necessarily acknowledging it, are not accomplishable by definition. This seems silly now that I articulate it, but this fallacy sure looks like the destination our goalposts are moving towards.
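For scale (my illustrative numbers, not the commenter's): worldwide GDP is roughly \$100 trillion per year, so a value of X = 1 would work out to

0.01 \times \$100\text{ trillion} \approx \$1\text{ trillion per year}

of economic output for which the algorithm is autonomously responsible.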

@Jono3h What's your value of X?

@MaxHarms 1, I think?

I think it's important to include the word "cognitive" or "mental" in most of these. Obviously a paralyzed genius can't play basketball as well as the average person, and this has nothing to do with their intellect.

My personal answer is "AGI is an artificial intelligence that outperforms an average human at most mental tasks"