
Joah Boland-Landa
My Tech Article
Nightshade Image Protection:
Nightshade is an open-source filter that lets artists protect their works from being scraped to train image-generation programs.
Ever since their introduction, generative AI image models and the companies behind them have faced backlash and criticism for scraping and copying real artists’ pieces to train their models, all without the artists’ consent. There are questions of copyright infringement and intellectual property misuse, and a general feeling that artists never agreed to have their works contribute to a tool that undermines their work. For a while, there was little they could do about it: efforts to sue for copyright infringement often went nowhere, opt-out lists often weren’t respected, and it was hard to find any solid proof that a particular work had been taken. So efforts then shifted toward protecting the works artists would make in the future.
This led the University of Chicago to develop Nightshade, a program meant to confuse AI image-generation models into thinking your image is something entirely different. To quote from their website: “Nightshade, a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into “poison” samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.”
This groundbreaking tool, alongside the same team’s similar program Glaze, is one of the artist’s few means of protecting their images. Where Glaze is defensive, Nightshade takes an offensive approach: it actively breaks the AI scrapers that take pieces from the internet without people’s consent, making the models fundamentally misunderstand what they’re looking at, all while keeping the original image mostly unchanged and still understandable to human eyes. Again, to quote from their website, this is how the process works:
“Like Glaze, Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image. While human eyes see a shaded image that is largely unchanged from the original, the AI model sees a dramatically different composition in the image. For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass. Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo.”
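The multi-objective idea described above can be sketched in a toy form: nudge an image’s pixels so that a model’s “features” drift toward a decoy concept, while clamping every pixel to stay within a barely-visible budget of the original. The projected-gradient sketch below is only a loose illustration of that idea, not Nightshade’s actual algorithm; the random linear “feature extractor” and all the names (`poison`, `cow`, `handbag`, `eps`) are stand-ins invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

H = W = 8
# Stand-in "feature extractor": a fixed random linear projection.
# (A real attack would target a deep network's embedding; this is only a toy.)
PROJECTION = rng.normal(size=(4, H * W))

def features(img):
    return PROJECTION @ img.ravel()

def poison(img, target_feats, eps=0.05, steps=200, lr=0.01):
    """Move img's features toward target_feats while keeping every
    pixel within eps of the original (projected gradient descent)."""
    original = img.copy()
    x = img.copy()
    for _ in range(steps):
        # Gradient of ||features(x) - target_feats||^2 with respect to the pixels.
        grad = 2.0 * (PROJECTION.T @ (features(x) - target_feats)).reshape(x.shape)
        x = x - lr * grad / (np.linalg.norm(grad) + 1e-12)
        # Project back into the "barely visible" perturbation budget.
        x = np.clip(x, original - eps, original + eps)
        x = np.clip(x, 0.0, 1.0)
    return x

cow = rng.random((H, W))       # stands in for the artist's image ("cow")
handbag = rng.random((H, W))   # stands in for the decoy concept ("handbag")

shaded = poison(cow, features(handbag))

# The shaded image stays visually close to the original...
print("max pixel change:", np.max(np.abs(shaded - cow)))
# ...but its features have moved toward the decoy concept.
print("closer to decoy:",
      np.linalg.norm(features(shaded) - features(handbag))
      < np.linalg.norm(features(cow) - features(handbag)))
```

The two competing objectives show up directly: the gradient step pulls the features toward the decoy, and the `np.clip` projection enforces the small-visible-change constraint, which is the general shape of the trade-off the Glaze team describes.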
However, some detractors of the program claim that AI image-generation models have already moved past the filter and can still understand shaded images. This is simply not true; the claim was taken out of context from a faulty news source. The program still works in its current state, although many artists understand that it will not work forever, and many use it simply to send the message that they don’t want their works used for generation programs. When OpenAI was asked how it feels about such programs being used to protect pieces of art, the company said that it takes attacks against its systems “very seriously” and “are always working on how we can make our systems more robust against this type of abuse.”
The one real problem with using such a filter is that, ironically, it can mark genuine artwork as AI-generated. Because of the way the filter works, it not only protects against AI image models, it also makes AI-detection tools think these real pieces are themselves AI-generated and flag them. This might send the wrong message: artists who really did put in the time and effort to make their pieces could get flagged for AI generation, and could even have their work removed in certain spaces despite doing nothing wrong. The filter trips up these detection tools because a shaded image carries many of the same tells that AI-generated pieces do, even though the artwork itself doesn’t look visually any different.
Overall, there are clearly still some issues to iron out with the program, like the detection problem above. For many artists, however, the benefits far outweigh the downsides, especially considering that their livelihoods and their art pieces are very much on the line.