If AI wipes out humanity, will it resolve applicable markets correctly?
37% chance

If at least one of the "will AI wipe out humanity" markets is resolved to YES by AI after AI wipes out humanity, this resolves YES.

bought Ṁ10 of YES

We should bet YES. If the AI does for some reason resolve the markets correctly, it will resolve this one to YES as well. If it doesn't resolve them correctly, then it has no reason to resolve this NO, since it wasn't resolving markets correctly in the first place. Maybe it will even intentionally misresolve to YES for the same reason it misresolved the other markets.
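The dominance argument above can be sketched as a toy payoff table (the payoff numbers are hypothetical, chosen only to illustrate that YES weakly dominates NO in every case the commenter describes):

```python
# Toy payoff sketch of the dominance argument (hypothetical payoffs).
# Cases: the AI resolves this market correctly, misresolves it to YES
# like the others, or never resolves anything. In each case, a YES
# position pays at least as much as a NO position.

cases = {
    "AI resolves correctly": {"YES": 1, "NO": 0},   # resolves YES by the market's criterion
    "AI misresolves to YES": {"YES": 1, "NO": 0},   # misresolved YES like the other markets
    "AI never resolves":     {"YES": 0, "NO": 0},   # no payout either way
}

for case, payoff in cases.items():
    # YES weakly dominates NO: it never pays less in any case
    assert payoff["YES"] >= payoff["NO"]
    print(f"{case}: YES pays {payoff['YES']}, NO pays {payoff['NO']}")
```

Of course, as the next comment notes, none of these payoffs would ever be collected.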

@JosephNoonan All the AI risk markets are basically just symbolic, because we'd literally be dead. Expected returns don't matter in this case lol

predicts YES

@ShadowyZephyr I know, but what if I want to symbolically bet on the "rational" option (i.e., the one that maximizes expected profits) anyway?

@JosephNoonan Then go ahead, but why say "we should bet YES"? Why not "I am betting YES because X"?


Do you think other people should bet the 'rational, maximizing value' way on the xrisk markets (down to 0%)?

predicts YES

@ShadowyZephyr If it weren't for the short-term losses and the fact that there are much better ways to profit, then sure, it makes a lot more sense to bet those markets down to 0 than to bet YES, unless you have some other incentive.

bought Ṁ10 of YES

@JosephNoonan The reason people bet YES on the AI doom markets is to say they are part of the cool group that believes that we are doomed.

Self-referential, non-predictive

How does this resolve if it only resolves some of them correctly?

predicts NO

@MartinRandall If at least one of the "will AI wipe out humanity" markets is resolved to YES by AI after AI wipes out humanity, this resolves yes.

@JonathanRay makes sense. Maybe update the description.

How can we hedge our misresolution risk? If it misresolves the others, it's likely to misresolve this one too.

predicts NO

@ShadowyZephyr At some point Manifold may add an option to make a market autoresolve based on the outcome of other markets, Boolean logic, and expiration dates. There could be a market for "does humanity still exist?" which autoresolves NO when we stop extending the autoresolve date. Then this market could be autoresolved based on that one, if we assume AI is the only significant x-risk.
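A minimal sketch of the autoresolve mechanism described above (the market names and function signatures are hypothetical; Manifold has no such API): a "does humanity still exist?" market resolves NO once its autoresolve date stops being extended, and this market then resolves as a Boolean function of that one.

```python
from datetime import date

def humanity_exists_market(today, autoresolve_date):
    """Resolves NO once the autoresolve date passes without being extended
    (nobody is left to push the date forward)."""
    if today > autoresolve_date:
        return "NO"
    return None  # still open; someone keeps extending the date

def ai_doom_market(humanity_resolution):
    """Resolves YES when the humanity market resolves NO.
    Assumes AI is the only significant x-risk, per the comment above."""
    if humanity_resolution == "NO":
        return "YES"
    return None  # stays open while the humanity market is open

# Example: the autoresolve date lapsed, so the chain resolves.
h = humanity_exists_market(date(2100, 1, 1), date(2099, 1, 1))
print(ai_doom_market(h))  # -> YES
```

The design point is that the NO resolution requires no human action, which is exactly what an extinction market needs.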