© GOOD Worldwide Inc. All Rights Reserved.

Professor develops tool named 'Nightshade' to safeguard artists by poisoning AI image generators

He has developed a tool called 'Nightshade' that protects artists by 'poisoning' AI image generators that train on their work without permission, corrupting the models from within.

Representative Cover Image Source: (L) Pexels| Valeriia Miller (R) Pexels| Pavel Danilyuk

While AI is advancing rapidly and can produce impressive results, it also poses a serious threat to many artists. Artists have faced widespread copyright infringement, with their original works scraped and used by AI image generators without credit or compensation. To tackle this, My Modern Met reported that a researcher has come up with a tool called "Nightshade" to poison such AI generators. The article mentioned that the tool takes its name from the nightshade plant, which was historically used to poison kings and emperors. The tool was developed by Ben Zhao, a computer science professor at the University of Chicago.

The brilliance of this tool is that it can disrupt AI image generators. All artists have to do is apply invisible, pixel-level changes to their artwork; as soon as the altered image is scraped into a training set, it corrupts the model. While the tool is still under review, it is a breath of relief for many artists whose works are being used by companies and others without their knowledge or permission. The basic idea of Nightshade is to manipulate AI models into producing unusual and false results, with the aim of undermining their ability to copy artists' work.

The article also reports that once the poisoned samples find their way into a model, they are tough to remove and are likely to degrade the system. This adds to ongoing efforts to protect artists' originality and ensure that their work is not exploited by companies training models without consent. A recent tweet from @spawning shared that artists have opted more than 78 million artworks out of AI training sets. MIT Technology Review reported that Zhao has also created another tool called "Glaze," which adds to the protection artists and their creations have: it allows creators to mask their styles so that AI models are unable to identify and copy them. The article also mentioned that Glaze is designed in a similar way to Nightshade.

It works through subtle pixel changes that mislead AI models, causing them to perceive something different from what is actually in the image. While the new tool cannot undo damage already done, it gives artists hope that they can fearlessly create and share more of their talent with the world. Zhao replied to a tweet clarifying, "Nightshade's purpose is not to break models. It's to disincentivize unauthorized data training and encourage legit licensed content for training." @AnalyticsDrift shared a tweet giving a rough idea of Nightshade's potential and why it is a champion for artists who put time and effort into their masterpieces. @anaisisreading shared a quote from Zhao that read, "We created Nightshade because right now, AI companies hold all the cards—and we need to tip the power balance back in favor of artists."
