Will data poisoning cause problems for AI image generators in 2024?
Resolved NO (Jan 1)

Nightshade, a tool for "poisoning" images and making them unusable for training AI models, has recently been released. Many artists have expressed interest in using tools like Nightshade to prevent their art from being used to train image generators.

This market will resolve to YES if any of the following happens by the end of 2024:

  • Widespread poisoned data causes noticeable degradation in the outputs of a major AI image generator.

  • Notable amounts of effort are devoted to filtering out or altering poisoned images in datasets. For example, if dataset curators are regularly forced to do extra preprocessing on a large portion of a dataset's images to avoid data poisoning, that would count (a rough sketch of what such a filtering step might look like follows this list).

  • AI companies make some form of concession to artists who poison their images, such as no longer training on their work or paying them royalties.
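
To make the second criterion concrete, here is a minimal, purely hypothetical sketch of the kind of "extra preprocessing" it describes: scanning a dataset and setting aside images whose high-frequency residual looks anomalous before training. The heuristic, the threshold, and the directory layout are all illustrative assumptions, not how any real pipeline or Nightshade detector works.

```python
# Hypothetical preprocessing pass: flag images with unusually large
# high-frequency residuals for manual review before training.
from pathlib import Path

import numpy as np
from PIL import Image, ImageFilter

RESIDUAL_THRESHOLD = 12.0  # illustrative cutoff; would need tuning per dataset


def high_frequency_residual(path: Path) -> float:
    """Mean absolute difference between an image and a blurred copy of it.

    Adversarial perturbations tend to add high-frequency noise, so an
    unusually large residual is a weak signal worth a second look.
    """
    image = Image.open(path).convert("L")
    blurred = image.filter(ImageFilter.GaussianBlur(radius=2))
    diff = np.asarray(image, dtype=np.float32) - np.asarray(blurred, dtype=np.float32)
    return float(np.abs(diff).mean())


def split_dataset(image_dir: Path) -> tuple[list[Path], list[Path]]:
    """Partition images into (kept, flagged-for-review) lists."""
    kept, flagged = [], []
    for path in sorted(image_dir.glob("*.png")):
        if high_frequency_residual(path) > RESIDUAL_THRESHOLD:
            flagged.append(path)
        else:
            kept.append(path)
    return kept, flagged


if __name__ == "__main__":
    kept, flagged = split_dataset(Path("dataset/images"))  # hypothetical path
    print(f"kept {len(kept)} images, flagged {len(flagged)} for review")
```

If major dataset pipelines routinely had to run something like this over a large share of incoming images specifically because of poisoning, that would satisfy the criterion.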

I won't bet on this market.

