Will risk from misaligned AI be popularly perceived to be greater at close (2027) than as of market creation (2022)?
Ṁ8,923 volume · closes 2028 · 89% chance

Resolves according to my subjective judgement. I will welcome stakeholder input when resolving the question but reserve the right to resolve contrary to the opinion of market participants if necessary. I reserve the right to bet on this market but tentatively expect to stop betting in the last two years (2026-2027).

Resolves as N/A if there doesn't seem to be a clear answer at close or if AI has killed / subjugated 95%+ of humans. ;)

firstuserhere bought Ṁ333 YES

@firstuserhere I put up a moderate limit order at 92%.

If you are interested/sure of your position, I'll do some more research myself and place a much larger one.

Betting this down because public polling currently says that more than half (55%) of people in the US think AI could wipe out humanity.

I can see more than 1 in 10 worlds where, as this gets more politicized, support drops mildly (more people answer "don't know", opinion splits along party lines, etc.).

“AI alignment is populated by the worst power seeking people with the worst Bayesian priors ever, example #3827392”

If AI has subjugated 95% of humans, that certainly seems like it would lead to a widespread perception of AI having been risky. Why should that cause this market to resolve N/A?

predicted YES

@IsaacKing Meant more as a Very Tasteful Joke, so I'm going to stick by the description even if the spirit of the market might suggest it should resolve positively!

Why does no one associated with "AI alignment" go five minutes without wishing for their own annihilation or enslavement?

Take your death cult (or slave morality) and by all means practice it, but don't pretend it doesn't make you laughably unqualified to speak about serious issues.

Obviously concerns about AI will increase as its power increases (leaving aside that "alignment" is mostly nonsense and that truly threatening silicon compute is at least a couple of orders of magnitude away, which is more about cost per FLOP than anything else).

predicted YES

@Gigacasting If alignment is mostly nonsense then maybe public discourse will come around to your view with another few years to marinate? If alignment is no longer a concern then I think the market would resolve negatively, so maybe you should bet no.

If we're still alive and don't have AGI then I strongly expect this to be the case. If we're still alive and we do have AGI then I don't expect to care about manifold bucks.


© Manifold Markets, Inc.