This resolves YES if any laws are passed (or judicial rulings made with the force of law) in the USA that have the specific intent of hindering generative AI art projects and/or companies. Examples include, but are not limited to:
Copyright laws/rulings that make it clear that training generative AI on copyrighted imagery counts as making a derivative work
Laws/rulings regarding people's personal likenesses that impose restrictions on tools & software capable of generating them, or penalties on those who create them
Laws/rulings against creating e.g. pornography with somebody's face without their consent
State and Federal laws and judicial rulings all qualify.
I'm painting with a fairly broad brush here, and this will be resolved subjectively at my sole (but aiming to be fair) discretion. HOWEVER, there are two objective tests I will apply to all of the above criteria:
MUST BE NEW LAW: the legislation or judicial ruling has to be new. It can't have already existed as of when this market was posted.
MUST BURDEN THE SOFTWARE/COMPANY: it's not enough for a law to say "no individual is allowed to produce a picture of type X using these tools," because arguably that's already the case with e.g. Photoshop. It has to impose some burden that attaches liability to an individual or company who produces, maintains, or trains software capable of producing a picture of type X.
Close date updated to 2023-12-31 11:59 pm
What law resolved this yes? EDIT: Sorry I don't know why I thought this resolved YES.
Some people are getting their lobbying act together:
https://www.gofundme.com/f/protecting-artists-from-ai-technologies?utm_campaign=m_pd+share-sheet&utm_content=undefined&utm_medium=social&utm_source=twitter&utm_term=undefined
@LarsDoucet if a law were to pass specifically about the legal status of generated CSAM, would that count here (assuming that law doesn't target generative models more generally)?
@VivaLaPanda Yes, if on analysis it turns out to burden the tool in some general way, rather than just attaching liability to an individual who uses the tool to generate that stuff. The test is essentially: is the liability for making the image the same with Photoshop as with a generative model, or does the law put fundamental restrictions on everyone who uses the generative software in a way that doesn't apply to Photoshop?
Put another way -- if it is currently legal to produce and ship a tool capable of generating a certain category of image today, but tomorrow it becomes illegal to produce and ship a tool capable of generating that same category of image, then sure, it counts.
@LarsDoucet Selling down in response then. Still think it's not super likely because law is slow, but there's lots of precedent for more aggressive legal action wrt CSAM, which will almost certainly become an issue
@VivaLaPanda Yeah. Basically what I'm trying to capture here is companies being forced to significantly alter their models, limiting what everyone can produce with them. Stricter CSAM laws presumably wouldn't force Adobe to make it "less possible" to manually make fictional CSAM images in Photoshop, but I could see it with e.g. the large diffusion based image models, b/c you never know what you can tease those into producing. Timeline is definitely tight for the general speed of law.
@Multicore Assuming they burden the tool generally and not just the user choosing to use it in that specific way, totally.
@horse Yeah probably not. But state laws do count, and there are fifty of them. Just takes one.
@MartinRandall We look at the effects of the law. If, for instance, a court rules that training on copyrighted art is unequivocally a violation of copyright, that has the practical effect of making things harder, and this resolves YES.