As AI continues to be a double-edged sword in the digital landscape, a new data-poisoning tool gives artists a way to reclaim control over their creative work and disrupt AI models that replicate it without permission.
The University of Chicago's Glaze team has announced that the first version of Nightshade is available to download. On its official website, the team notes that Nightshade works similarly to its other tool, Glaze.
Today is the day. Nightshade v1.0 is ready. Performance tuning is done, UI fixes are done.
You can download Nightshade v1.0 from https://t.co/knwLJSRrRh
Please read the what-is page and also the User's Guide on how to run Nightshade. It is a bit more involved than Glaze
— Glaze at UChicago (@TheGlazeProject) January 19, 2024
Rather than being a purely defensive tool, however, Nightshade takes a more offensive approach, allowing artists to "distort feature representations inside generative AI image models." Nightshade is designed to "poison" AI models that train on original art and images repurposed without the artist's consent.
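For readers curious about the mechanics, the general idea behind this kind of feature-space poisoning can be sketched in a few lines of PyTorch. This is only an illustrative sketch, not Nightshade's actual algorithm: the encoder choice (a stock ResNet-50 standing in for a generative model's image encoder), the decoy-image approach, and the file names artwork.png and decoy.png are all placeholder assumptions. The idea is to optimize a small, visually imperceptible perturbation that pulls an image's feature embedding toward that of an unrelated "decoy" concept, so a model trained on the result learns misleading associations.

```python
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# A pretrained feature extractor, used here as a stand-in for the image
# encoder of a generative model (an assumption; Nightshade targets
# text-to-image models specifically).
encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # use penultimate features as the embedding
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

def embed(x):
    # Normalize with the ImageNet statistics the encoder expects.
    mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)
    return encoder((x - mean) / std)

# Placeholder file names: the artist's image, and an unrelated image
# whose features the poisoned copy should mimic.
art = preprocess(Image.open("artwork.png").convert("RGB")).unsqueeze(0)
decoy = preprocess(Image.open("decoy.png").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    target = embed(decoy)

# Optimize a perturbation bounded by eps so the image stays visually
# unchanged while its embedding drifts toward the decoy's.
eps = 0.05
delta = torch.zeros_like(art, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)

for step in range(200):
    poisoned = (art + delta).clamp(0, 1)
    loss = F.mse_loss(embed(poisoned), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)  # keep the change imperceptible

transforms.ToPILImage()((art + delta).clamp(0, 1).squeeze(0)).save("poisoned.png")
```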
In contrast, Glaze protects users from AI imitation. The Glaze team recommends using Nightshade first, then Glaze, as a one-two punch. While the two remain separate tools for now, the team confirmed it is working on integrating them.
AI models have been the subject of controversy for training on art without artists' permission. Some artists are pursuing legal action, while others have been vocal in their disapproval. Nightshade gives artists a way to fight back against these models and "increase the cost of training on unlicensed data."
Earlier this week, Square Enix revealed that its upcoming game Foamstars includes some artwork made by artificial intelligence. Wizards of the Coast had to issue a correction earlier this month after it was revealed that its widely popular card game Magic: The Gathering used AI for some of its artwork, despite the company previously claiming otherwise.
Taylor is a Reporter at IGN. You can follow her on Twitter @TayNixster.