New Data-Poisoning Tool Aims to Help Artists Fight Back Against AI

Published: Fri, 19 Jan 2024 / Source: https://www.ign.com/articles/new-data-poisoning-tool-aims-to-help-artists-fight-back-against-ai

As AI continues to be a double-edged sword in the digital landscape, a new data-poisoning tool aims to let artists reclaim control over their creative work and undermine AI-generated replications of it.

Announced by the University of Chicago's Glaze team, the first version of Nightshade is available to download. On its official website, the team notes that Nightshade is similar to its other tool, Glaze.

Rather than being purely defensive, however, Nightshade takes a more offensive approach, allowing artists to "distort feature representations inside generative AI image models." Nightshade is designed to "poison" AI models that are trained on an artist's original art and images repurposed without consent.
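To give a rough sense of what "distorting feature representations" can mean in practice, here is a minimal, purely illustrative sketch of feature-space perturbation, and not Nightshade's actual algorithm: it uses a stand-in encoder (torchvision's pretrained ResNet-18, an assumption for illustration) and gradient descent to nudge an image's embedding toward that of a decoy image while keeping the pixel-level change small enough to be barely visible.

```python
# Illustrative sketch only: generic feature-space perturbation with a stand-in
# encoder (ResNet-18). This is NOT Nightshade's actual algorithm, just the
# general idea of making an image's features drift toward a different concept
# while the pixels barely change.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in feature extractor; any pretrained vision encoder would do here.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # drop the classifier head, keep the features
encoder.eval().to(device)

def embed(x: torch.Tensor) -> torch.Tensor:
    """Return the encoder's feature vector for a batch of images in [0, 1]."""
    x = TF.normalize(x, mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
    return encoder(x)

def poison(image: torch.Tensor, target: torch.Tensor,
           eps: float = 0.05, steps: int = 200, lr: float = 0.01) -> torch.Tensor:
    """Find a small perturbation (|delta| <= eps per pixel) that pulls `image`'s
    features toward `target`'s features. Shapes: (1, 3, H, W), values in [0, 1]."""
    image, target = image.to(device), target.to(device)
    with torch.no_grad():
        target_feat = embed(target)
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        poisoned = (image + delta).clamp(0, 1)
        # Minimize the feature-space distance to the decoy concept.
        loss = torch.nn.functional.mse_loss(embed(poisoned), target_feat)
        loss.backward()
        opt.step()
        # Keep the pixel change visually negligible.
        delta.data.clamp_(-eps, eps)
    return (image + delta).detach().clamp(0, 1)

if __name__ == "__main__":
    # Random stand-ins; in practice these would be the artist's image and an
    # image of the concept the poisoned features should resemble.
    art = torch.rand(1, 3, 224, 224)
    decoy = torch.rand(1, 3, 224, 224)
    poisoned_art = poison(art, decoy)
    print("max pixel change:", (poisoned_art - art).abs().max().item())
```

A model that later trains on many such images would associate the wrong features with the artwork's labels, which is the "poisoning" effect the Glaze team describes.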

In contrast, Glaze protects users from AI imitation. The Glaze team recommends using Nightshade first, then Glaze, as a one-two punch. While the two are separate tools, the team confirmed it is working on integrating them.

AI models have been the subject of controversy for stealing art from artists. Some artists are pursuing legal action, while others have been vocal in their disapproval of AI models being used to take their work. This new tool gives artists a way to fight back against AI models stealing their work and to "increase the cost of training on unlicensed data."

Earlier this week, Square Enix revealed that its upcoming game Foamstars includes some artwork made by artificial intelligence. Wizards of the Coast had to issue a correction earlier this month after it was revealed that its widely popular card game Magic: The Gathering used AI for some of its artwork, despite the company previously claiming otherwise.

Taylor is a Reporter at IGN. You can follow her on Twitter @TayNixster.
