Even though generative AIs like DALL-E and MidJourney are specifically instructed not to do so, they continue to create images that emulate the styles of leading artists across a variety of genres. But now, new tools are available to help artists “fight back” against future AI model training on their copyrighted works.
One such technology, known as Nightshade, was developed by researchers at the University of Chicago. It relies on inserting pixel-level changes that humans can’t see but that cause unexpected model outputs, a novel use of steganography.
When someone prompts a generative AI that was trained on images protected with Nightshade, the resulting image will look nothing like the style of the artist named in the prompt, subtly confusing the AI (and, inevitably, disappointing the user).
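To make the core idea concrete, here is a minimal sketch of an imperceptible pixel perturbation. This is not Nightshade’s actual algorithm (which optimizes its perturbation against a target model to redirect style associations); the function name `poison_image` and the uniform-noise approach are illustrative assumptions that only show how a change bounded to a few intensity levels stays invisible to humans while altering the pixel data a model trains on.

```python
import numpy as np

def poison_image(img: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a visually imperceptible perturbation to an 8-bit RGB image.

    Illustrative only: Nightshade computes a targeted, model-aware
    perturbation, not random noise. Here we simply bound each pixel's
    change to +/- epsilon intensity levels (out of 255), small enough
    that a human viewer cannot notice it.
    """
    rng = np.random.default_rng(seed)
    delta = rng.uniform(-epsilon, epsilon, size=img.shape)
    # Clip so values stay in the valid 0-255 range before converting back.
    poisoned = np.clip(img.astype(np.float64) + delta, 0, 255)
    return poisoned.astype(np.uint8)
```

A poisoned copy produced this way differs from the original by at most a couple of intensity levels per channel, which is why such edits survive casual inspection even though the underlying training data has changed.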
I’ve discussed this technology with members of Spectrum’s creative team, and most support it at face value (i.e., protecting artists’ original work) as long as it doesn’t stifle the creative uses of generative AI that are just beginning to emerge, such as fast ideation, spitballing, and mashups.
When “everything originates from something else” and then becomes something new again through the next artist’s interpretation, the main challenge becomes one of degree of derivation. AI art is a new genre in itself; yet originality, not mimicry, should be the standard to which we hold it.