Will we get AGI before 2026?
13% chance

AGI defined as transformative, at least human-level, but potentially superhuman AI, capable of doing at least any cognitive task humans can do, at least at human level.


The end date of this market seems to be incorrect. It currently shows 2023.

predicted YES

@Entropy Ah yes, I left in the default date.

We don’t have an agreed-upon definition of intelligence, AI, or superhuman AI. You define AGI as superhuman AI, which I doubt will have an agreed-upon definition by 2026.

predicted YES

@capybara I do define AGI as ASI, because I think AGI will already be superhuman at several tasks as soon as it can be called "AGI".

A useful definition of intelligence could be: "Given a goal, a more intelligent agent is able to achieve it better than a less intelligent agent." Obviously "better" depends on which metrics you care about, which in turn depends on the goal; but since intelligence is a general concept, the definition needs to be general as well.

For example, an AI that is more intelligent at chess would beat a less intelligent one more often than it would lose.

An AGI would need to be human-level at every cognitive task, meaning it would need to perform at least as well as, or better than, humans at those tasks.

If you can pick a cognitive task that the AI can't do as well as a human, then it's not yet AGI.

Even at that level, the AI might be transformative. But I suspect that the closer we get to "every cognitive task", the faster the remaining gaps will close, and soon after that it will surpass human level at most, if not all, tasks.

Hard no. "Any cognitive task a human can do" includes work at the cutting edge of hard science, and I think 2026 is too early for that.

@ThanhDo Yes, correct: I mean any and every task, including the development and improvement of AGI itself (self-improvement).

Thanks for the vote, I hope you're correct.

© Manifold Markets, Inc.