Background
The Take It Down Act aims to protect victims of non-consensual intimate imagery (NCII), including AI-generated deepfakes, by criminalizing its publication and requiring platforms to remove such content within 48 hours of notification. The bill has passed the U.S. Senate and is under consideration in the House. President Trump has said he supports it and highlighted the First Lady's strong advocacy for it during his joint address to Congress on March 4, 2025.
Despite efforts in the Senate to add First Amendment protections, several advocacy groups, including the Electronic Frontier Foundation (EFF) and the Center for Democracy & Technology (CDT), have raised concerns about unintended consequences of the bill's current language. These concerns center on the bill's takedown provisions and their potential impact on free speech.
Resolution Criteria
This market will resolve to YES if, within two years of the Take It Down Act becoming federal law, at least one documented case emerges in which the Act is used to take down, or attempt to take down, content created by:
- A journalist reporting on matters of public interest
- A satirist creating parody or commentary
The market will resolve to NO if:
- The Take It Down Act does not become law within two years of this market's creation, or
- The Act becomes law but no documented cases of its use against journalists or satirists emerge within two years of its enactment
Considerations
- The Act's primary purpose is to protect victims of NCII, not to target legitimate journalism or satire.
- The bill's notice-and-takedown mechanism could be misused to remove constitutionally protected speech.
- The current version lacks clear exceptions for matters of public concern and robust protections against false takedown requests.
- Implementation details and potential amendments to the bill could significantly affect the likelihood of this outcome.
Update 2025-03-18 (PST) (AI summary of creator comment): Definition of parody or commentary:
- Content that makes light of public figures or public discourse qualifies as parody or commentary.
- This is distinct from the law's intended target: protecting private citizens' likenesses from use in pornography.
- Documented instances publicized by legal watchdog organizations tracking this law will be considered for market resolution.