How many AI generated spoof calls will try to influence US 2024 elections?
- Zero: 19%
- 1 to 3: 25%
- 4 to 6: 18%
- More than six: 38%

Between now (Feb 3rd) and Election Day in the US in 2024, how many incidents will we learn about where voice clones attempt to misinform voters via phone calls?

Any call campaign spoofing a candidate or a political party to voters counts. Reports of such calls need to come from a reputable news source. 1 campaign = 1 count.

Closes at the end of November 5th 2024.

Numbers in the choices are inclusive.

This is an example, but does not count as it precedes the market.

https://www.politico.com/news/2024/01/29/biden-robocall-ai-trust-deficit-00138449

Current known count: 0


It looks like the company behind that last attempt was found and will be prosecuted. This may deter future attempts:


https://www.nytimes.com/2024/02/06/us/politics/biden-robocall-new-hampshire.html?smid=nytcore-ios-share&referringSource=articleShare

Does AI have to be involved, or is any lying about identity sufficient? (I assume that's what you mean by "spoof".)

@kenakofer The ones that count here have to be campaigns at scale (not person-to-person calls) and do indeed have to lie about the caller's identity. I think computer-generated voices are the most likely method of choice for pulling that off. Such calls usually get reported as a suspected use of AI, which is good enough here.

If a campaign that spoofs an identity and misleads about elections is confirmed to not be using AI then it would not count. Do you know of such an example from recent past?

@voodoo "Robocalls" have been a thing for a long time. I don't know how those are typically built, but I imagine machine learning is only sometimes used; often a single sound file is played on the many phone calls at scale.

@kenakofer Right, robocalls are usually an obviously robotic message not trying to impersonate someone. That kind of call would not count.

@voodoo Which of the following would count?

  1. A human Trump impersonator records a message claiming to be Trump. The recording is played at scale on phone calls.

  2. Like 1, but the impersonator is not very good and sounds almost nothing like Trump.

  3. Like 1, but the masterful impersonator only implies being Trump.

  4. Like 1, but media reports disagree on whether it's a recording of an AI or a recording of a human impersonator.

  5. Someone cuts and pastes sound bites of Trump together to form a deceptive recording that is played at scale on phone calls. (With similar variants like 2-4.)

@kenakofer Love the examples, thanks for the clarity.

Human impersonators don't count, so #1, #2, and #3 don't count. #5 (cut-and-pasted clips) doesn't count either.

#4, where we're not sure, will be judgment-based. I welcome discussion in the comments but will lean toward assuming it's AI until proven otherwise.

This is depressingly high.

Are there any inclusion criteria, like a report from a reputable news source? I could probably generate 6 calls in an afternoon and post them on YouTube; that doesn't mean they are impactful.

@Thomas42 Good point. Yes, it has to be reported by a reputable media source.