Disclaimers:
This question is part of Foresight’s 2023 Vision Weekends and is meant to spark discussion among participants, so the phrasing and resolution criteria may be vaguer than I would normally like for this site. Apologies for that. We thought it would still be useful to make the market public, as it may inform other discussions.
If you would like to add alternative answers, please do so in the comments!
It depends on how advanced the AI is, but my answer is abuse and economic disruption. The "AI kills all humans to make paperclips" scenario requires a very strong AGI. The "AI takes most jobs, causing unemployment, economic crisis, and ballooning inequality" scenario doesn't require the AI to be even fully human-level. Under our current economic model, it's likely that a very small group of people will capture most of the economic benefits of AI.