Will a deepfake cause any Manifold question to be mis-resolved in 2023?
Resolved NO (Dec 31)

In light of concerns raised about the authenticity of the Prigozhin Telegram video, I suspect that the standard of evidence required to resolve a Manifold question is in some cases low enough that a deepfake video could push a market to resolve early and incorrectly. Will that happen in 2023?

Market Resolution Criteria:

  1. I will resolve YES if any question is resolved in 2023 and later requires re-resolution due to substantial evidence or strong suspicion that the original resolution relied on video evidence produced with deepfake technology.

  2. In case of a lack of consensus on whether deepfake technology was used, a secondary market will be created to answer that specific question.

  3. If the cause of video evidence manipulation is determined to be old-school forgery techniques such as body doubles, prosthetics, or masks, this question will resolve NO.

  4. If any questions about possible video-evidence manipulation are still being actively discussed at market close, resolution will be delayed by up to one month to allow those discussions to conclude. If a market is re-resolved after this one has resolved, that re-resolution request will apply to this market as well.

  5. I reserve the right to ignore any Manifold question that I believe has been created in bad faith to distort this market. To be honest, if it's a meta question or one that relates specifically to a Manifolder, it's probably out.

To ensure impartiality, I will not participate in this market as a buyer.


🏅 Top traders (by total profit)

  1. Ṁ249
  2. Ṁ175
  3. Ṁ150
  4. Ṁ103
  5. Ṁ91

If you’ve seen anything that might trigger this, now’s the time to speak up! As things stand this will resolve NO.

I’ve not seen anything that met the criteria since I opened this market. Has anyone seen anything particularly suspect?

predicted YES

Do videos of superconductor replication attempts count, if they are found to be faked?

predicted NO

@AnT Wiki: Deepfakes (portmanteau of "deep learning" and "fake"[1]) are synthetic media[2] that have been digitally manipulated to replace one person's likeness convincingly with that of another.

IMO deepfakes are strictly limited to likenesses of people; other types of faked material don't apply to this question.

yeah, what @HenriThunberg said

bought Ṁ100 of YES

I know of somewhere this happened. I'll be sending all the details: who did it and for what reason, together with the specific model they used and material from a 3rd party on how to detect this particular type of deepfake. I don't want this to be public though.

bought Ṁ10 of YES

@levifinkelstein interesting market

bought Ṁ30 of NO

Could you maybe add a condition that deliberate attempts by Manifold users to cause misresolution on Manifold are not allowed?

@HenriThunberg Yeah, that’s a good shout. Maybe I should just exclude any market resolved on video evidence where the video’s original source is a Manifold user?

predicted NO

@Noit what about a case where

User A convinces User B to create a market based on a video they know is a deepfake. This is not known beforehand, but revealed by A after resolution, in order to satisfy the YES condition in this market.

At this point, I expect any kind of shenanigans from Manifold users haha.

@HenriThunberg Yeah, or even just a market on “will x random person on Twitter declare their support for Y politician” where no manifolder would have any meaningful way of knowing that the Twitter user was not a manifolder or a supporter of Y to start with.

Maybe I’ll just caveat that if I think a market has been misresolved deliberately to fuck this market about then I will resolve NO. The spirit of this market is about public trust in deepfakes and not whether it’s possible to convince your mate you like anchovy pizza with a deepfake.

predicted NO

@Noit sounds good to me 🤞