If there exists a super-intelligent AI, would a majority of AI researchers answer Yes to "Have we reached AGI?"
2031
61% chance
Super-intelligent AI:
"Something along the lines of -> smarter than humans at most cognitive tasks, very very good at some key tasks, and can afford to be indifferent to anything it can't do." (@Duncn's comment)
"AI that is better than the majority of humans at most economically valuable tasks, but not necessarily better than the best humans in all of those tasks."
(I created this market to gauge opinion for @Primer's question)
@ShadowyZephyr Resolves whenever there is both such a survey and such a super-intelligent AI; until then, the market trades according to what that survey would indicate.
Related questions
Will AI be capable of superhuman persuasion well before (>1yr) superhuman general intelligence?
52% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2040?
45% chance
In which year will a majority of AI researchers concur that a superintelligent, fairly general AI has been realized?
Will we have at least one more AI winter before AGI is realized?
60% chance
Will a major AI company acknowledge the possibility of conscious AIs by 2026?
53% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
37% chance
When will AI be better than humans at AI research? (Basically AGI)
By 2025, will most well-educated people expect AI to within 10 years be better at intellectual work than 99% of humans?
19% chance
Will AI surpass humans in conducting scientific research by 2030?
36% chance
Will humans create AGI, either directly or indirectly, within the next 24 months?
29% chance