
Nightshade, a tool for "poisoning" images so that AI models trained on them learn corrupted associations, has recently been released. Many artists have expressed interest in using tools like Nightshade to prevent their art from being used to train image generators.
This market will resolve to YES if any of the following happens by the end of 2024:
- Widespread poisoned data causes noticeable degradation in the outputs of a major AI image generator.
- Notable effort is devoted to filtering out or altering poisoned images in datasets. For example, routinely having to run extra preprocessing on a large portion of a dataset's images to neutralize poisoning would count (see the sketch after this list).
- AI companies make some form of concession to artists who poison their images, such as no longer training on their work or paying them royalties.
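
The second criterion is about dataset curators being forced to add a poison-filtering pass to their pipelines. As a rough illustration only, here is a minimal sketch of what such a pass might look like; the `poison_score` detector is entirely hypothetical (no off-the-shelf Nightshade detector is assumed here), and the threshold is an arbitrary placeholder.

```python
from typing import Callable, Iterable


def filter_poisoned(
    images: Iterable[bytes],
    poison_score: Callable[[bytes], float],
    threshold: float = 0.5,
) -> list[bytes]:
    """Keep only images whose estimated poison score is below the threshold.

    `poison_score` stands in for some detector of Nightshade-style
    perturbations; this market does not assume any particular one exists.
    """
    return [img for img in images if poison_score(img) < threshold]


if __name__ == "__main__":
    # Hypothetical usage with a dummy scorer that flags nothing,
    # just to show the shape of the pipeline step.
    dummy_images = [b"image-0", b"image-1"]
    kept = filter_poisoned(dummy_images, poison_score=lambda _: 0.0)
    print(f"kept {len(kept)} of {len(dummy_images)} images")
```
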
I won't bet on this market.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ281
2 | | Ṁ53
3 | | Ṁ20
4 | | Ṁ15
5 | | Ṁ11