A new data “poisoning” tool is reportedly giving artists a way to fight back against generative AI by scrambling artificial intelligence training data when tech giants scrape their original work. As one artist explains, “It is going to make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent.”

The tool, called Nightshade, “poisons” AI training data in ways that could cause “serious damage” to image-generating AI models, harming future iterations of the technology by rendering their outputs useless, according to new research obtained by MIT Technology Review.
